[Binary tar archive — compressed payload omitted]

Recoverable tar header listing (owner core:core):
  var/home/core/zuul-output/
  var/home/core/zuul-output/logs/
  var/home/core/zuul-output/logs/kubelet.log.gz

The remainder of the archive is gzip-compressed binary data (the kubelet log) and is not recoverable as text.
e͐Sf%FK΢򊧷~6]2Iհ`$Yn TU _*xj2aVy3!.ԌtO}]bApvXsupޘ81סGCG,Q8TSn+<-B];@e-ehpLOImFԠ6d+di(kc bBzVKbM6:th(.õUm'zQ%s3JKs& &9*>ADR\`-܌6/ ]]nԪT˭79GQۏ?`JQŻD(?)NFQv꣠h;8GQ.h0Lr[F%v)lG NL@4pa(*M$C"(<~!-]FI0Y0 UȶN(eW[RǖԻ,T{l d cϿ0`bS&qfcAe#v1P{x@-e!h '[밼VL `0&FtS#7B#Kf)Q+Z$r;PtFwh }AQXv CTpwɼ܂"+PZAi +}Sϵ`T1ak fDmx?mB ".;Y [wxiuxcT'\*!0TgۉLIZ$0{"RΘ=իRWyfiL0@hPĂ͹6g7 S^~il`%MǠ)nh4)51OKIؗ9 tdu{Kou ?@q#@h8{HMTEl9 M+^FRR s@)*r++{s?{=l{n>>,dI@ E@PC:+d2+NHAE'R]?&9|#J_xO,ߤIJ,%p $8 4KY,˅` ,Ma9䉴j2犱8Fp?TZ%. ULS(dBLI(i-ORIu97DrJۛ:i 4F ąK]̣J ɤT&TTekmFb0`E/eOI $Hk۱|%lɒ>ґ:[lKBUW,cRm`))cq }h"Ln.i}\ew.兇Dm+9"jA@@" v%$p{ȖǪ0&,5d _֒JIϓ-V#6`l+ˆ9rmt9Ѳ.POȒjU &FA҂}ڹ;[_R9*O5eKXƢ%'ղH/oi! Y%b@mhi~!qHڞǽ327IЂ Tp%@02 Ll!TÔYdV˔l{,)UlC#3f膘qo$П]*2EԬ3 #bLQƛY%ƒQ%Hૉ( ۹ryꌴF?x_zA=:{׷'ZlQX--#X (≊A8=ݒ=%BFSI/5YUV2Ou3`QƐbKv>- ~'!{@p:U:qAznY=עE 覦R[-P3bNgZ] d ll姳N]6OKN:okLTŭi:1+&r=>AA)-|g;|xY ٧29I9XCqW%Mi,ԻqsݘkF*!7sya33e=uΉ6O͋~tȳ rct6xuc,j1Zcb< + N1jw `EEdڰ'㾬![FFֆD ` *źjFSJibF`ܿA)HwY@IR(xc/إeMF*LiNP$0GVD-62f[BV`]glg.(Pn??^7Бj3N\F. .斓^E+EHTs sOL)(V 9dT]iQ&f:ब=rnWȰDph>iu3ES,>k8ijc/ 6̂gީpY@RDEOsĞK!!.DLDّfIH<ƒlCln^E!dYk%0l5NLRXڊQIRW; wd[+_l,ۜZT1Z*zPץ T(}%k4Hv"5&֝nbU=p|&N*Ƒ!c.%0:מ~ĺycu/-MrVc Ԭs}0ߋfMhAz_ 9l2Pb1d)si^K%v: u3~k1>G"lt/Qo:C˟ &A&1O$Z* NW ~׹?ol vɹ{cj~;NoD*hN|GsUz:+N 0 [/8[W\!L͙qƋ_o~)#qTs"ٯq)Oq|3 o>{1# ;r{ er3j!E",N|hlK4#WqWfI1h_&*5Wԙ[vڡ[6Y]0d>!?w9Z$bڨ_q KF,pcoǓ)n4b|sr߂HygƓMz*MG߂gʭJOZ[8<؂ K9RT ڇܱT r\2"h}h*z{#T$|i8tn3x·pc39>g|;Ҏi+`3ڷV1+^p$sV=e=_H̤H'QYǂv棾MǺ-8x6}֫AgW:#hg"@8*+htKX*鴨eIk}T%w`@T PUZL+j>$F,K$u֛{F45WIb苎~I?{;6ctoK?&;;2{yͭ^W9ZDx8pR}?X;6-_|5o[Osf@k CH>VF[+ZqZZr); ZuL ιl~+E*$>.l)'G6Ŗj:*jTZ%AJ>^^<~C.ti4?ְ+^:0lU,0s})a*2QZ%>e-ٺhkdd1+ck-x5մk5iv}8" zEҙUP+0yKq=V*\4s2OGa^|,|:إ=JR[±l V50bw.Q>esjNtAE*G1ȡ=N)֡0cV2iN,9HW矅JdC2޻0h./_sfw[y}c! 
wa_s3bqԌۙ^7XúyobIh4W1[HW~h},HF8`ܛ: @2T)NਚZʭl[PĪ\7 `NRElu@v,dJ>Xk G%-rkrKn=^46/՟]Uy .&5Vl>Zy=.Q1]>\32[A܌)hDA2j֣[#.^ ShBZ{HfA ޓrʳUaK^${&:O>ZZu패3o{ku[z붫7 GD{Z|hEc1ӠU!/^6# Ѓ9jNqm6P SԬ 3´PR>(,KlZ}$pqwikRǯ @* xR?M4*ή^0yȠ1hH'O$dg];C\m;%Ie7͓NQ \sݓh0U&'VQ97);.~Т<%l1UޟHF|= c |țDETWu|whԢX q64ȨpDj1Zd#OÅ!~+brQꔬJs(yi5U6V§?{g،B>+yf)eא٭H1K=Ʊ,fTf12Y̴2^QLdT>ל*E˗%yt d8+\PL춆l>?c.ܐ_(n\鋔9s(b<7Ә`{AVs@;TCNlqt\U`mk8n(#~w{I&MDG쿯yR+ky(_o SF3j7]J*,mn(1Oy |4jhRwNly{2I~AE!:˞wWGks~aXH,V ؕ# -t*k=5?J 1d$ASY;:Yjk@gr-Z Ĕ %EiUl+ch~"ؿm#yXat(/^a}*02x~+߮Y^{Ls߹_u}/9);|h _,鋱Ѕ\ }y+|&`XmhRQ+?+o*52mB+Ɂ`Go7vʬ\ |Gxn4e\џˌ\fڟkn%ai+ YO[z* (@P+Y\ .Rlnɘ+Үa d<ؗp9VSGox#pt|E_= LY>\t 8x.OCh@@N[ yrq\Í+۪bނZmMλ9f伛n9{aY lDp6ڣ'_Z7U77)[ھ^+ӋdxZ[g*aqፋ9[ZqIɌE벰LރF#}W%L!;:M&[M+6t"gUeU)uE@Rj.nXC$Փ{BTůɓv4+(5ym$=qeτ-RMVx_ˡJŹq(ye+[ܦIŐy]{28>>^ydX'`_ 2Rc$qlTX{[IPlGZkl-Dil}$] &\e$lPŒvJ7#UpB",܋xq8#7k2PJ*"F[cw+:HBOv)J!RH./);zk]I?'O{BfF540Kt+}GM8,7$\n! [pHp8,&b?2z/v^&H5 ̕b}\5uPeRj$?Afb~^Ƥ-c∰"Q⥔: $j$RSd?H8 P<6RW}6w0ԕ^f $?lQ}#dVL,rK̠TxHNq4>nukJ eZb'mM;,!K=bԣs 2?g};1uzոK{?p_3v6vOA[ &ogʧSsryf_M0 bE2$xMQQUՆUIa,De<9t. X!@A-YO\X XJ5gnӱp=]S4}dN@!%SC24O^%DKٻ}G!Se=/px2qH.:/ r帔ǫ .Ln8$jW>\3Lusa'%8OM#!'W|;TSb-0:tR>Q]{Hu>k!0sqJD [ku,Ey0#5SV;d\QҕN6`e<, 1TQI3*(ڥjHi46*!97)+F0w v S &*3fX+ъCv) bH$J@5tFǬ[[&(Tn~9smC9jt!v3aCQNDޖ}`G;gYg,]KSI+;R{66=>.3}DZulz z@ AE*+<ZZ%GE1m S5 jG`$SI-ט.J_/>J-AjE0(AH Y5!VhFiꇂҡ+kULd(H6nhMpwql5j+'ↂJBzc~ѧ*HZĵI]v9~5U*ڢG!^ij37ԝk͛Ml G<6V?X95V.gӘmd#1Ʉqti!n%Kʥ4:H&5*\וA*Bok.ߢ:)Tyw ӮúmJ٤]EÚRѭ~d]ɵWoA&Sqԥĵ% ٨VpSԠq5,{x:s-M0 2*>eH"E>|WF7ńU5$vT$vw;>ÔAGT(/)*5kf ףޜꈖqJ8&8}qBНР͑T5.w IbQ[&̶rn:*@Q j) ^T*H3T>EowR1#+ԕ%>eVqRx*d׼V+URU/-QWw[vmCՉZ@AbS`1$}`AQce{.$k}.Pՙ֘?\&Euu$hF|4*mtvUU](%Z.S ,RR̨豕R{[BwebeQk)(5`ٛ/GG-DHv Aq mjջj:l c[6I@a:b[JNSp1D^Hpz=is'ޫ+Lb5Uvck1RU L(5^$dOxt@λ+D? i> n|^rNڹ,7% |}Ri:a_ @ T0R̹J.\Jࣸ㣮\d' ǞGe_9u/uUw~L!"||-$LWB’ -`roL\:Or˱8hv>T2)_33u ,я҃Rd`Ƀ17X>rHp`a<:u`s JLoBl=K-eG̾}Ӗ0b9PxfC06>jZlP][3!.f1]|I{LSҢb<.3~q{?lϐOnSȹx6=C$Z'1HnҾʳ'K"kO}@;E_[ bc%\aƫ3^]ohƫO :L^xcT3\fG1? 
0eu9R+53qݪy3{Nы$\1b+&{ua4ӟ 6Z>vppqȟ~4|Ɯ{%}C{i,ўd/1E$d=$Jꥻl\G{1bczj݃X;O» m^ԙ"{&uq&/?yO3yޓJU 7:+e&/S#/o=[a2O; ޴jF*m ؎N0uMw<1އj|w0c56ٚݿFV|ڵzhK袾Z1I墻 J7nPI.*РQJ/  @DN%s8 nPsU{v? }GB!br?\q!^&~yyj LcܞhrմdŖj(&x 6)fth`ck[Vk!U%Lauj.S,绿xfaB/%SV @=s8U%D($R9#q)K/Ȝ}=zUտ50f,SI,p!amuwKһwā0ElbVOeqQ!<؇q¥Óߗ 0"%xV KJb:đc[IvWz LFOSP='ˆu؞,i?{ G߃/֫d{62zml!Cn~W[\NJr-Mp|/"}>>=0G'3SSSf%!:%j`9G!&vs!YL؜zhWԲ]^Pg #6ΫsW,ӄ91] s.l+PV~|"lG#uŧD1zjM${՝=}fYƊL`D4)`Tlm05%~  "+͘ksЂ\CZD}b#$<Эsi{Qk [veƠ ^$9p{'XNo,MP: I{g9[=`?I`CuVFF#" ^iKiT:`ӽ6I*Tjn<{#`$pQSomr~}_nٺpe`fSgG eϼad|W8RXL~$  *ٲ[]eqСPSȪAg| qq@hCz%6}:&SsRH~k:RZT1͏_Xxя_ws4./*WZ4ZҪפHBq89Ύn?.W<up1EfhÏyhG5#=(.`:P$^,w0r-òmlx¦s6ӳîx&f2Yo%kkY^ _ qŹ7֖Z֜dB=@wge¤S }P<4cYuD:Ї`^|4a,#Z9zQ%.*ԟs2a Tl6 {PA@ɂx;R'AtbXjVn vbVe ZQ#Pqsgyk?A L}U0|,=]X>xKlu:dq1V>r֞CeV2:{6u^4 I8԰n\~%a&Y#(g'lZC?qrsLãL9iAz:d `_7V~;c[=6z @m*%%fh1ZPEC]e7ZBWk]o($K3Ǽ{#rTo!j!tnaLԵ؄cwyRu=R'e/:$+0EiD;,uܶÊ4*ij !\tZ@{ZWJ &0Pyר;Be7v}YsUhk%@D,x{PȽThaY΍r(l<|7}zA{tӖ8u֑x6@lyxu`Y7l (/qz' G>fU/*OL8$YgCOC嬃q@mf68x#i :p4N-ۏ˃\F:t`RE-_z˖1[+[Kո="hKL?E7՟:|@Շ3;TvVANC͖EWm0l{aV k+D6pYCZ#n9e+m~Ssݜ={%ɟl¤m3o}f\z Ȳ'ʲZiRWH)a%$Alu]`z![ u% Ҡޕ g8md%֨u 8|׳SR-R{Νf+aȌx?$O;fpZ)la0eON5؝o-4|riKyyOs|g3OEI뺠oX}-" +~oEw!hnɯ_dN|;nmڏd5wLXuzhUq?,~\gK޷XfM xк;`R]1Krʃ|ڪFEbv?rܒ 8ӕAA(-v)8RlWkA; 1&:;Ϳ^㺛H?g䵥4D.*ez,"}:ɇ/uF] oc%* pv8Y(% |=D\y8W|~sb#[qW$'T&֥x'ĞQSH9d.^SɉSA6WGZ+nQe_6VXմ͝ v2E0-CJ֋I+Y*R"o-CW^nǀv |-IŋTA;$4FG=hX dž͂%DZh̘t—J[rՀ]ϡzF @ *piU; 6 ܼiDXY|'3і<2JihD|<0(:{M.Q=6jlu0<}{KRӊCe x}P4.Au֓QVLͬo {jUFn*a4JM:ުm$;t5~P>u<~CТ Ry5ZͰmy[d;yxmJV-Oҡ;/)* p=j;Ol [XGj^}rP f4T~῾?^Z[w-9'Wem>Ltmȴ_\޸5 ]̓w`)u$J娭qLctBgA^gd H RDu;ϲ* &k"3{2 jmx7=h{Y䆡KR:+cUQiB\ BEDe,xoV ť=(1bkI$!؍8)QCz81XToZuzc '&Q4:1CW]b %\S}T@dY.>e(f ɺ\WM;Lr?{udw_3YM9;.>jC9:V@{Yo ׂr*)[DW{95yOE-HtwE Te`mIvf0{!Y+ AlYGKh39kevbk1738UMq{)a|jBD}$q=gY˾SY=*EKp "6f؃_3Zh\* 7_|b|qG;h\42 eǭh8x%ike+'=xб~w.1mXc:tg(RqܐEb/Ycg?kZo3X9vC(Db4k;,_=)~?oTƏ(Huy}9ͧ8hk&K{h]5o@/RgԐg@gkϽ_nN Q +&.:sUMdϜMxS]'_,D Pw|ͨ]=gd|M7,2JM+c5@~=̆G9'm2X!I&;@fl Q&)6/ͦfW{}pj&Щz<{Jbٮgb:Z lp9Pa(f Ĥ&nfDgfo633T31 _JS?P*MؐD=:ff߻-YK 
DkHJC-V)5G sy%Ur,Mth S 0d hU4m3 L_ͭEo._b]=;`[Q-i^]8DRW3oEe.@cPʸa,Vtp5fTH> }:0,23S7̬CI̬x` PD:B ;u.2H=7ˀHitD{ցv3 '›tX㹧hA.u"J@;3eMp&!;bG*{jWerҘ_k/dcEaϠqTy65zz\q\lh\B@QT]D%Sbj{"D[uf C%Pxbg~Yf0rM`Ee.?/{ڌz3w|n=6&!OoF~ qn-$QB# f),pS̐8OЦj<?-,p+n~bn=N_ B oi("*5)wy+" FxRBQ9Q"n`)F[f<:/~!Jv\VY׹5qZ~nrnYRx#xbWkV5NAX6* H( t&-\)Dk%:BB3#kabd".Rc) rQՖ M!MfBϹU(Ea#$8G3A+tZsRVZ4V;R08E"۸qhCe,fQ"'00o5Њ&q;;GԈ4Scp_hy sRxR: J/~! ,>u?m1Pԉ(X`o*~6EPJ[p4Lp6IRMw-|Z<[6oOKo/7 Poɷ,W',~gM7K&VqjZ hʛ>|Bt~?=i[{[ 'ݔ>@]PL( sm+`~Lhp+sMTOǩs!E8~q>L)?~)at""o+FX !(n`b H|[T/Ojz$A&-T!t-֥u 9M`'3q"4JyDm6{1)$=/^#=5|u%Wu/!o]7} $7~ ِ#[I*NА#uXd%;L.ar.j 1*a VW AG=3{}UI+ji2K^\χp237P(1 Iÿ+Ur^= N+ oJFww?`OW`ftViɗ5ݝxY<9G| _S#`o.aˁDm_^9POUu2ّEGdەcJ5lۏB,"=1b -'[9j#0s ۻ8yS6f̄~!X2g[ 92:z;Wɀh3A3.Y;$JE X(A:XE 0τ VJ'G4T =BGԀeVz[Uc`\b_[]D‰u EK|; }0MYʔBqXtpWYNYXm ~\ }rbJQJY `BX8Afxg )Ϊs0rNJnT-YQks5MY7TmYυ,3]6=ҫ ]Fh4ɿ z"*d3icA&zGŞ5v_FyMRBMU%j}kGIaAu˳Wt(vp2ZNlU٠Jr둸^d5j$kw*NF;s0#T:%1%)%Ϝ`3P'RUBح,c#0A Ea.7"iaEwc=AEDTPŐ& BP.~0gG/uaIW&]aXꯎR*8gDR}:GByM6oeR3kh4r沩B1ո ;eovNU ;L؉ ʰGYS\YcLӭXITcL\3 դ0*oqeSx+R0z|x6>ʁ-F5jQgZF"n֔9go&$8ݴ5+jh=lRdǫ Q$"U]/vUy/%pLбQ(@,DD:a` .!yuOq2!2Uu GQ,= u(J)PN) ÖZ\l A|^J+m݃Ugm :Z?TFl>*ЌQjbn#QV| 駩PRz5DXяI |@0paMF ۈZaJ| qc˒Œ תBn0iÿ9ۏJ o.)Lp"Rn"2ڹieHI,rB $b†[nў_ O\)ۚk tpNp%cL=N J +O Óm`y,%]BQ$r$ Ю&Vpx`!ƒ{.QfRR8!"tEB9|9w j\r-1b.cʸWn@.=Bj]v^ƪȽ=ZǕU 쎣X m|@UT|*|Q>C؝/a٦@7Q t8 d:ʢ~aizdxIМz ޘoTl/`L OZ Ca֐@K]`GP9jM9fQPp]MmغX;kO]w?m\6 tm &mglt45vX>' @߀!!w˫6 ΂/^$G`=")I̐5e>ӫ>{kþx`F72 DRs__퇯QV)[^q[؉.0'f zkZC@Z`BPi!"Zȟ*}"h`h]V(==~1M'wqx?4МXvd!q@*Iq&3M;s][sŪ/pC*ބ{Zlnp&/^reWe3,fB^kBW%ʼḝ ݇Ei5Si͵TC'&qycdƓlͰپ56l IAR~,CkYgcU'"OFӎ1u@l=/xyɵ\YI;\六jkÑ.]'r4BuΛ^+k圙a\ҖwhgJ/ !L3?1_9k3 y͠;k;pk疵[jD~ H@ҘdO- 8DƦ.I~gۙeNvOX]ɫդGY("y;Icq|:/3o}fOL:woaRo~wKlYtᦳI:J;G ߧ#|l׃]'1?I ΂ 'pd@{g;=:Lz{rϯN7l9s<Ze<ˍI>s1ޠRwИjb.~~?6O=\_dl/ΏAz4XmOEtfڻΫs%{ OKsFT핃oW`o[Wes?nqLY ^M1BKxAoi3¡_ᚙN*'2θ+ÛQz:gSҤ BieRL Avո?}?} Y8}@ݻ߁hO&$»H<5+hѰbx5y=HtYTvhcc&ݿf,Lb_3~3{hX(5XƟt3f4*I"x1m ⑏-b&0i97vְ@1.ak(a3VORKQp>assOԁq2pmSk%e ӽb$ƑA 4JqJ sJDž 
)Y8%xgᓔaPzoIto-ɣ%%ًhI-ɣ%Y)M>M(K6bڱHIM#(Ƙ{a$ǒ)Y22O!]_ocBc-kjdNXFBeߌe6Do1˥,sB.eWr)..p!_YCԡ#1pcU>!4ӛAމ!yZ 앫?c.t{6m]]>/]-6]-u( x跍8Hۃ*CrRB;0i6lQ;x!h 0Zt/bʌT#7^I\ 5[JEJUdPj?PhfEBPD9Gƚ%Th1wX"Cx,lQdAQ *~^u] bL`y/vAi:~Ja] h` "zh-"0Hj  R k0%+d"@WŨVTW$HDa&PLL\uŸDx:^u:!BiD%-p0Ɣ*\F2BU.} BVytl۞ths7M:tȚJG0Dp`,MdC8FZ2E K 9uHN-;1F1!pYp&(A%+ J1CFW"CCXd(20:9eP` pRHP͝u`d)0 1R)ccm.hQ 8nb`(U_o)1G 2=Rh L!pؖnQκHVJ*-B1⨴8`ѕiB+WS \WH1,:H "/OipYA-7[e0~p>`2C9.V2A)U* f_+F`*L'!B5,dͫ2̀]$C d$5NSɹ\{g914fFO-{Q]tp@+:LfrNYTa9)X#Jc]]p] 8.,r^Żi|373َCK^ۤy)P.AaT,TwBnj&Rl}3XKa#mCO|eR.k1;腗4H^dM}'̀K"0Mb#&]^i͕ͦYwa/`ܝy6dҽahK{Np@aK9]ElGg8+Bt|"(gkh ]3 a  FkO eYEYA5 +MAhYLOz% Pg!w݄I-n鹿QVlhlQidMˎ޵Fr"Kf\* g '0bO^lu񮴑F#=I鑪&i[,"Y5ﶰ$FE|V2fl_E@ű4=?~ Ov[Jz\N`Waiyuͮ=)p,G^|]-FW A@ݪVT :ՀdJyiڝR`J!!Y2ܥc0`myRt+(&j:Rn'YJDufB'v#7Xn JQXUSE`@a $.#)LV,dڵbBOva2J#E.Dcϟ|.l6jwC~[ w_>픜l-e..b}| qg΢:F,ѺxJZ@*Ϊ` J<`3L2.*GSx!RV{Ф>{$uJ}{܃DINOmu$t*"n1ݣf͓G趧jB i SHcBjZiC!hoj:UIf`"HGd 1/XEs2aFffN>\?Ϝr gBwMHjo)y-:߲\ǯ1݌)1ҶcW]R;$mtfÐQ.P5k LJÐaϤ+l,Zd"@$k!:.,X"ZOG9Pb`j05=`H(Y RYThQDQs(a1~).w9n?_fS%=a#X2%~:xrYGpK?q;D"-κ"!O4%KZnQ I(ӓKFIČka(卼`cQy,Ƽ֡{xx5YV\\t-bGW(QI"y`J8Cw,PP/}QP,(1I,fQ3xa]fbTKlz+Ἐ#2"g%ܕn0}X3m͂Վj#TOp 506_& >B?'ԍԢ㮪P k+ 6JKDSkjWޥ[,jdS^h:20'zqp$aF1=vLfH?L=>@ajiƘF_؆Wa9a$uNIi_h!&_C߰iT4©POp2B2ܙH$^ d8IcXVHWGq!R%iVfq<$fE(]"S5Ve&1.sJMǸt̴|maefHBtŢv2\ɥMPD0p,%c9 bxLA/BbyMJ#JH9*V__p4f1uH |6,Fg1?Y30.XJ@xS ]ۺb]Z᳆%\ieC ?T6uMD чWX)ٝeNjM.< JBd*s i*j]e5U>⟐ϝW?I[i )Ԭ֔4"pp'T"C@y lT桛P- [h5cn&?w`LC(<:rs-v\Rx.~0W JY#n+ ,3DuxoDI4!J4*׾TzM,ZXF-Tuh 1Ѯ#clVgy!hǁX dO;. њV̙a*Zk@( xP1M[d"r-Hɰzy xՇ͹0z }o-X0d`17zI_4FƗUq>J*J@j!tX +\C`UuČ0ALr^1WVZHA_ mNRlI{B%LÆh \_] pbeЯ^czTUExJk% q<9I<*c0Z g盋c]R$=F,P#Mf4vAVɰ\}rk\9.esR K"(UUF79 a:&S3)5*f[T"Ѐr+dhJqymŸb֠gP֡ͯ$AqqQyI-PةƟ*vϩYgW{Fu-3I TBi*pQ[4rdEQŸ{q@9Pr̡s1qR-k5dmC.i-(a7B%qw]Z)͸bu]",.].b/D=x9XĨc lIJ1H#uzԂ cQ(ty}.N4ˬqc_Z0%dfQ^ 󖋇\I8W'^&ۀIj T=LVYepFL ̚! 
U+ua`tܟKխQ's=;1oLbLq]*C)z~JRRL^:†Dѧ|mRN&k4)=I9 d07O&7r }`lT;LBz}Q{Й4Ңv%~R&cLJE8_+lNc4W\8hf'rȄ|osT&P58Ƥ~h*#.;ˈsYȑUFz z4%v{SeyRWkXWO4a ^!QZ1裔7tA-^ ivXFs jv=r>`:Pj4 *ĜhhN\2Y| ‘B6Ք,cE*S-cbYNDmY*;Y1=1,,_$T,9/7o Z ע%Rք((ȼ@A2!W3Mp=u%ԡ,qF5 s]@5DaH5dA)R"oCJ2Y_~B'Ch+W"2|'ɡu9S,1sdb1-x #<H,T.! al i 'CW;'!!s }Fi ޗf2jNO5^.x Kt h$TR eE(и*^;􀫙`+ٔb}6 fPT{o8K@:b fZ*^ڬ?¡g%.H@EBD+2A GoY bs]&F-W!(f$kooj2Aq>CP,AeWp~y.?}*{u1E׍q[<DXq H-l4-L,ɸFOZX#{a 1 `adJ ^pC"ld=MQࣞ ݰQhT0P8k܁DYIE*p*JBrwQTp~)>yYV*j%҆ i]wI_[d!]03 $?fa4ͱRDyUDLR| ƶDU:p*N kcnVjߵ/yFK,) >WuX ʴLN*"8\(X4SE)&Ӕ hi$\aYjL# Pw9bQAj3~8q6RR. 2( !O`k xgm"%˕+04`U |Of;|4ĿOG3>T^4j\)B& WOϊ7zyp _Ao*eT¥c yq 7x9/m?1r_p2QyL",Kf&m 46" Q"0@a%f# CyPϘ"1Λ#뜑'Mg5Ε#9"3UZbc}zQHǏXU܀9}x8K2Zfvr.M/*3LauT%* ]]`﷐5 8<F$ 2t\|p_懓J\vo끻1܂_jkBfA<% C;wpڔ2{'͚RϢݯ @ހ4%I#_̆LK=$>М*J3>!wlId@›ܙ`|\LnU?ng_ddy`/1u몈pw\)#r ?/q>N$U_;ˏύ[|/Ҝa(<  uc!0|i7鏟\.t_>㫙XKC\-bv Z9?~;qSs /U/;-> nMWө]nc3A7NwF>(Հ7p16$R8"%QȤ?o9fX4Vcp *Lӊ4ٺ5 tu!AaHV79xus̊`Z˟'@1Bd=1z%}7LTJ}1o0C_hlqkd,'Ey-&.(I&xCg?,)yd;uή2+ۘ dڿ] G`%x1MG|l2]%y/+DlAnb{S9`Q{>*~`9_=ዌMe>4̮TFl@S"#IH83KYPčuH%f5HD1yd#BXpLwm$W  DE\ F)Hiw&(s~+%)b{y!k6Ry[ q1iOjImXq-?o'ϫfiCP&:ԜE[48biG*)%D2E rCGE#T;#B,hm*.Er jp {~N1EQ9"L^019' gӔ{bj8y˦拙/黾bn۫_d^s$cp$t.|Mh1|s~9@N5D3 fr?Jvr/; 惌dFbW+w Vpr2fA~)M^2M_4ZvWTM+j)UѾQ#ˬ\<UsEfځSxEn)v_4qPQ!lix)jbytUbʪf{l_Tp󋵗*/Uř)^|{ٰ.MW9oAcq^N:2^^Ӻj6~@iΪ'y^wx5 bTWV;3)r LhM^ӄ32!Zt/5 Aj۵ [2lsʁygd@{U}W`BsϬdȻ%-{iS<1wFft TZܐ;L^xl nArBg."=$ I{V;ZV"j-Gy\I&Pnr|wBFLc?|*%1$!04_`I&3p1Q(vRP͜ 6pS 0 2pQQx#w:x4JjN=yr!~7[x[ wR!ls0ET!WJ19ZJq$1={p@^xh`qgDWPXn+1 ~<"}TIX~<"Urڛ{-^WVsz]ubIfN5Qilkv; ph;BCaC=CYYDوtZIp^S*n gj| d=RVj:'Э"Wv1Is ^~-M\Y!6pMӍ[e ީwmjHzNA668Lx;8%Qc}+xny5;!r{L_}IWM[yD[†ʕPp$S5g ڭ*EDjeGϩi-ꐐ.UdJd tuOݔV"S2|Lϩ֭i-ꐐ.UdJcyŞv̇ܙVĠ_R;S.ۙVhi ELoǹh7lmng[=VyE[EL1:.d_ ݪbPDtQF *vCB^,S^K~Nĉ^Ls?wxlu \*q$s{\Mv^;hQp1.Jm&T}2zw6 0 ˙{se)6Z.o7j|j #5`k37M%HAiFAF#"ǝ!-w S 2d׉W鈯Ƽ8'1ž:d_WRu UBv2E\S* I`Z*u5i7 -{bHזY?"o|qCZr{#Q ,({rs "rTu-t~2lhUG؏wD",iːs0IBNs@D+\&v5K 4 \£>%FImVp:I?# OFL!MVFil[%=,DN6"o7i$k8| a~Y.,v BK kW޸Xi3Y];V䪩Ns [" ǂa1 
5oh˄LƠ&|14覩밝rN#ZM\n-BE.n/m)!-=-oL}Z2:7>yWtw5Z5;!vs ћ7g[hI,Dδ? yB=mg27waKME;+:)`53σIq}b:PL _&댅ءVuP~]puO>JU>܍jOaTc:9twzK7vڗ'+Ui+lWq3la=jlƻ>,L0:6kj ;ж@ZHu:u殱7B1s:Q%bkfcQb$QSh5rԨQlPкu(QhB;$~5Qp0mB(Zԯ(^I[!w͢>\q%krҥ%q%Tβu|i9_:V" U C0 A O9E,$N)9 5 8d'¯,Uz3i̐>SIJ1X_$OIOT# 䌳DJ&e?K=KΜWRk}NH}ϪJ^RuηN3+ju*AB^n SD3'*N~n 1}х g%w.WdQeos}NkR@Q9u(*˔z({c RPjpʿ2GC06'Ymx sqm³]KWxKtV%)a {`Wt0LV L$t,uj&h:#Jީt3/fl/۫/8b|7uqa¿?&W-O|竅=Uo'y\wOK忆MM΁8Y߾]#Gư'BgE3bp"|r!?M? {<XJ"vɷ~^mנ@{!4f{)J|qSU2K8D]fc v dkרt,>bx& K`i{ %2{b2,Iv-VFW%C~]$ŢS\IA0W ~׸_=,iAO`Rג-,Ő$z30 D_i +5!][ Vf:l_WCw:4+ʌ 3zB^|z%FњQ#X1MTۛ2 ܗUM"bF&ϋ= ''Ƚ|EfI:h֖d0L(h$(* gN:NV2NlY]ѱAD<0($9NϤT6qMB;b<䅯e-13Ը0dSRDs6(b& d$Bieihlhdh$02 [D>_~%q}!Vq.fyǕQ|X<+>6j₀SFB."E-4L)!Q@ G!+grItmrֺhDomb> >RIzD> aXn&I^ML,?%|'Wˮъ#{鼎fF凹(D\*N]]@*ej VA agqi8"0ҨX  4eu bʹ59McJ3;\il` w\q}`ݮk^IwYIkKju;X={p.8#@X)Ԧ`-A5Rs/T`{TAyW JIq."8`HN%`8&$"*`\H)A.E3>f#0%Lh0dad oA޳6m$WPrۀCUt[K.+~TxQM4@ږ_CA SI$AkmP< xDD(XA|!:2R'0 8 'ek&isKń#., !aC|`Fa "PGX$" BJbS$2{S 3‘LjB>u敄.:>}'x@F%8R0JBbMwAИ װ8G<(]7D(A Wc?8AfE,<(R!H6#߶XxE+d0f~^L*`5? 
p^z~XE^Y&Y\c" F2 #R^+!|^)ГГ o#";*5/|q; mIiytQgn PQ~rPcN6[-s捻ZI9LϦt=>W D1>LjpyVJ`x N͗ȜQ-_->5&Ef.CmC A-VKhZV3&[XM>B-&fW{aVo8T hA5xDliOtpK6AJ0[{eJ4+aoa9| Jb/Hc홆X33 %Hŭ&oY`X*" WDžTYV淑3:7i/LuF1&W={D^LrijlQࢷ¨ cN6#k@}m^ F7l栘H쭍T0%SZKEŇn%-'V;QJݴxWoVM +;,ޅ(`}$fcmw!Ͷ{_4'a!$f ~IԜ +Y S:ki,pַmo\&b؉}LQ%b\i4k ҅`41Ejݬit!'ٷ_[3vٶVɷeD9JdB*i7.Ma s0p jm+ldˮ2X!Iwn~I>Oӯ 0Lh1{g D$vpxۀNJtwzH07EW\8*{8[+J?p|$b0}&Gvtgj@i@n7@i*\UL/)6}<%_ч((i8iȵ Q=ohHb,9> eF3"+ɤK[Pjwx\#ֱ)6oW GH< qNXܡ2;2Q&̣FX C.$A&eo]՗>  D=vuL 0qY`e<(ؤlYCܥ*bOFA,>˓&ξ6ؕӓ &p+kNXm P6Ay!{$wq*#F0xO` u6@ZPc[)<+'EEXyy8IeaE߾q k'co0Ki) [9ͣӽUe㨍_$ݞ{ ?'&r݊/}>>WRj  jGd.l/sC%*/E끿j/ѹe܄CKV]!U4I1]OgE]aahDv3 FOVnpx׺ F&Rgͫb^}.}0F1 ĊXQ%>LMA @eCkf̰.ȌcUhh=\mʒ)/  ` `sO_ql:ЛϱivxQzҸ3);T/Mx4ٻi~86K\€ahnZyabfS]b`rMqqyTM!Hy Dn@?@z;u g7Ns#€bĻH)@O>xbJyiN;b@=IB>`::[-Ta0:zq= /0^0k6]?a:6 l@8`M0,9B PRFf2/u5fߓl8x\4ߋJw^՝,~MGtiF/bC09#tP,~WDJqфϿSk@|eV3<;{/Pw,t'c 0[䯖c+Y1ݯe/B󵲘#.ޏsX_~XjD>o^jqw`7ղKuQdN~d'ؘI\[j'?tE2THtnzkeFƒt쮬22:f^-CYXcwR~jDёY^;_[^-| 6RӍYTu2T&*n5>Z[&:x#e}ؕ9X2+cX֜sGow,GoPpӎ~MN~.;tI\Ɯ?=E.|QЮ[dW/O (ﻙ$Qɇ2YFI{?;W0-oWa{0fԱߵ.w/SةCH%|uLzAdm*ŀGQ Ǟܹ9w gV5[23.8ڵ  U;DU:vƩgm`> ^r-LhHݜk苝5Trvcu/uz23}GYZ0NۏQϣˣIZ L d*ʕ({qbWnI*5 Tv뺎Tδ!eёwNdگc &EX~M%F^$9G^t7RGO?e^dL[sk2unMέSMeñOt0b ̰T jh@T2s.XD%:Ni_?޵3]MU=ӽ~{m6"״xz!w7~DkՉsUzٳ9"#W<q!/jp$>$vGU^+0I 3c:Z$S$0c2]_KAbLr~9id0`VvԊuũm]ͫqPu8߽zF{?/j"'Wy/`(Ԡ Lk<0TK<4 Q,.$F`c\ t?*_Af3BP0䇞W&{L^9o+~I*{ayk%B.j쁕ж@noF,0Q!dZ1 HjʰUs]/T/o^T@Jdj1\_5$/f |\"GįH o=X_Uzl4uwуO_!t^kL|θ })7>bpw,TS8J_M90i(YŖX`,ID42- 1䥱mQd~MBzE׊Vzmw*ꝍx-"ʀd(_k-9-.zw p$la$JOM@ETHY,K W=}ۘ'y 0GL(>#(٬)ղ=lso~ih I ܓ)+gbv.ik`s_{C4M7lmqq#ٿ"Or_Շ/]/;$쇝AI[;F\&iHy41LlI뾺1Ϣ? Ɠ27f}{Wf\=׋ mq0Y?|~suFuul?Dn(R|gهӋfyae~,bȦ Γa8ןnې]]iGp{b<뱙/̧-/LrS[=a%^m&79̬1x{hzPI`r x9/.)|f щCʍq0 ڨ&our3q;kŃSdvن **Y/r<{y>7GrF ~yw*ssUffkYІ/8^jrKFn 5PPzHOϱT#O u9<-a-e{-=b\z4 9JyP_tpo7`va$_S' JzvT/$Glff U!J9 RQ!W'g~IɫH 8x{&,|Tm+K PGD#p#4 S ďu:ZP! *m'iˣ%`o[Ņ%+ܱ ᧂK_N:x3TL sÛ9dP)!7H`LR@Z$P!2JP' UhK?PruZPsԴ:ӖP9ʖtVNZ*5|箆0J4vsL!Oq.;~VM￐- b"$)g>8}nFѰr3m MI{2Hk. "_ IWlzPX͑SXnƕl$EyPw(ҫ/,6e deCe. 
mگگMMZ+/Lͻ|'`xlXh0;7>peD OO6L .o;p7#l+\aS (ʃqyw>b{kAwS$0B=sUq>&Q 2XH}gy<.j,$Fʧր[b`Y ̱kT{XwdPG|1򭾄҆3='{.hzǢXNp>$QDݲZlN|~3;{''p=OmXgP{O۫waThk* ƨ򼺴8'6c,,H'AI91 2lTH$=)H7(?}sXJPă36%U6lv]s`տ'-Pɻ̮ZgWۤ)_GaLs=J6Lv:rPAh ~olArR@=Rɡp+HxX~msL$q[Glm9pi7UGqN ʩd;BG YwRrPEE눵]5\UsU݁$e *oHO]eZ.O]ψkbBP{*u>0 u ZH3RO^O,w Oųhi `6;zȾl)9ު@XzEiOROgi06_JFyg:F-V_5WQ>NϏfu0dѿIQ(a6H\IZ4!ksIg? W~fr{$IStZN+j >% >huqWpO ;CιBx~h 8 c$,>4ŒڇQ)CE\Q.91Ba"e qewzYe SJ[!dkU0uGq%b=7;9D`)Jv*×ʂ/wkm^̩ɡh D+ 9|@9ZwC{u8m kUnt0~rZ;^\,? ˕MZ26 P~eBٖȘi-sDd AxZѴw#p;\`[=0ф$g]Q]{ FX*iu3Tť4'L:Yv-ֳȭQ]AD1/ޫq`;jň1@P #/ ͨMʘq1 iP3IC!""qC,5:ZEgYum s̭ٙ^(Gw)"_gT1Ps{âiZXNf=7JFGTuzY6zDHuVysu3`2XG'R53nv kެ#Kc/ùtۆȢг#/FNR\V^-*G)ɞb.7qĹ"c@lE&}H~eXv[6uV05FE}i`YFn'%VP#տZqRpc"i|eF"W?L4 F 1_ e|12)PŠJPHFq#+9x87rD*Č!P0$d ib!{!,1A`beA!by:EƜh襤$U]W1 x(Ea(bőͯXuV":b14F*\Jt$c]\&TTX`,94(5 쇉TE62 T88Iҙ)L$z<`4A֭73="r f|w13);@Z #c`<-2~yg.XA:sӽG6@O6xA}қf7W?6Vmׅn/k]lp{KɼP ZQ]-u]OMŵi>~#Q]@T%ވNgD=Kay&ڴ%Xo3.Yw+8^4 /N\BKZIb]Eެmn}B3.DH!/)Fs"syvU{CR7}m]ul/-UX[jޥ'yA~)`l>!vV-{8tǜΎ%nn4q6;ds?eoWu&v,,+nU}w}n1TneOs$%N*VYT)^T|:JU%Eɧg*GT阣 84އz׿?ų|3P Å% }bPaiۙ"6yff]fbgOD~|=G_ued"CsyMK"D8kU+q$o/!~^W5}O}aNft8C+I:h4~Kջ٧}:vx%s#"d&f`/ݺ j& T_kEnh9ݛqSX%k'p(m h"$Bh!p*@`!RMRlG`Zh5Z.Z@-0Ҿ +w (5uhES%B&TƎLXIfA#a~pch@3TŒщh* {e.~fӦ=ΗGPǀ-$pכU6֟ U۹trcdC/k&_Ehhg-l1!v[a_ ;>#"tc\U1Ce[nBv * {a $T:c>Ny+錑'IjVfI7.dp B-څ7q(ܦu!5D8bbEo?~~Df}FxNm۾}b'v!!߸6)e^}dL@3gEF AeE!!ݸJF*p-`)i/4c a@;F,2`eoƳq9f(lQI]-i[3S7X ovIDݑ͆Ѻ>IZc%g v7GJaTzC>.:ZH[9I}\wUq1tEmCQv4m@6Y;2^jWh0о/,mԒʾ4 ] 4OP39N[Pȳ\kw?Ŵy؁WXS +} m,%X 0SHKekaZȊw^Jm ^ĵ 2 TNXDC=^O?l~uvE)AqȆq2¡DEN$+BqyD e?^5g"BGz)bC2:I"S +69ku.+e9S FcDN,x7[#oS8d߷CF.n^dWvNX2Mˍqbԯ_ cV|p܉2(9^ t$0v*Dzzh4k Īg Q=%ԅïzY11Oy.UrX|c.闑LEH/ rMTkG gp8yO{ [o7}t~,Ww̨笪smR֦Rz$0H-YsE(ƺfsMScr~^=ׇr$:sE?|ՙ)0"Flݑ"HoM#U[/d dq S¼:l9|^;7hkPh5GWT/ SI\huսVZݻͿ ZC 1,IXK0e*u"*e"Jǜ"X1dͯ)n7vNEgo ĠnF*fȪ{ᶝn.eG0wkXRB&q`,!*wޣez9_6)vi@OwYذ=4X#M"vQ"Y%flTʈ/2̬n}s*I } ֭-S]xi@.`6t qPkeJ(8iF'9cbVtd\KI--KgqWIǒPѨc)ؖ1BכW3J\^?Tw4"뱯V!.םpz |eyo18~xF*>>+qȇ~ջw32{!US0ڽho`WVScYMWa^$ZNVW`>ɤi3qlDȮì H;EmR^8Ue 
L=֍1XЁW-ʒqorf,@S%ũrR?Qu]OWQU\=]}7 ,CM~wϹaK"xo]`3E8P9kSaūJ_ O__c%\?.}oX"gv'||wwx}5!\PbO_36@b7w|Qc=@TW$sY[9+ 9TgJVxWSsd ]FkE9Wp U0ʆY*ˑF+zfM13>Ha[s2@8v7 F`QLBčTqt@L^55ߓR|eT O1:B8Y`ǻbY}P+}j{vlU 1QT 4c+>og[W]d*YԻN&; ؛-*{ F+UjVc㽮cdثo^M',>&H,^xw)P/7C7+J?Zg{,Mꗆfhe:0KkRNQLwLqx"IU)H ERz)qb/}5jvSO -EaXyM^3%X#f/íPð|vg{ 3oH'%x>ql~-"h,4~[҅"2SkG롙"2WX3ez[c&vw@]4*9htKBtJTj ,e`Ù*Mj"P*4r:VqRV7?d % &.:o(nNZĄLHg5ń<(ou-,5!n{^D1ٌ~Jj?n~ iwZ%` z3 ~4<)ڇo>v|'[EM|a߀T뒿lb5̮nGϮ_ +{s$McJ? h\zJc{sn(lr5)KF&@7j&TqZXx0Z #DM+Ȳ^BDk*}.3j)ԡ%|҄zift=G [7!W>}3 ϖgvIxf6<7`b,v|H!L$̸ %=P\rσ ^p%Mm4/m1}ꗜ"+j-/p2ΤͮtVަT6}oWD1.dP NyjOSW) uhW! h(q)-g;a H)t\t0\w*KZgT49Oe ( "`.Ғ*|SxXߨDŽ,&.0{!g !9kv3){UaoW_0Hgj/Xk`֭xPD$qAo>VZ?ϗ8?;O)'+Wey+Nwr Z3=%I(ǂ3#(]!% $jR]RN V}BCeC̗5:T,, 'R]4]p.g 1=CtqfG!XI?`\I .(0l.Uh'$17E.%M7P0pxt$ cf( L^M1M9^rJD{ b(I%?`\)1P5{ pf 3$񡌺K絧7;H&$VIV,}}RX0>(y4@㤚"B킌`/"Tpc et].'y/"׫FcL2hMҙ)MH=7hx@o]܍9YJ߃RE)[NYKEW-\Q((- rP{`!%DO^y4ޕA` NyUz \Y4-4ʷ!30hJB #F A/bky[;IwVNؖ5%UM5ҎH}-eL5%ciiZQv_Xvwz_InyƤU`0~R:9֋$m~Ҋ6f2I4ECgeM~ VjBƣ-S\:}iZfDthyhZ޹ Gno'03 :T'4$oG;y|ΙL;ztdά8,kv:x3fƘTyTRG-Uasu@("ǧ,4Ry .p0%0-m %Hmj ,eɸ,*paPRqN(8:Fmb=u3KSvރuo4t3z]=遁.ZGG bF?bue.{eސS# _o.? E*a0YlYCA7Df< 4c#*3gH"Ӭ3UOf+]t3t.HهOvP:cp9`[QX$ܖ|F8p&M\^j^`h!P.ؠ!ЦӭmŦ<]~2,'),M% g?IIk>vF$ڏ^-~P:]hsLH6%!UCVDj"˗ aY;09/k=ĜARFkhI/2Rޛy]ӡbZ)HJPMF`ZzfN;vZX3JAAcvAoïqu>{j@CqLŢ5,46l4^.yS!C!Jj  b%'ʪH;Zg!*Dž@&Fc$ ]R)-5G /92vGڵ2,X/5O˜0F4V^̄< B І!D8LcR(!)Z̃r{G=/)b<Nj᤿pD$tL8P JCAqa&DY=y%Eug6g#40HsWsYsO~ېP?[G? 
˖g/?c]1=aK 35f87x27e 0@2Mc 9 ٬&&c.r"83+h T[cMJFc-{%h +K $F0.JG^ ,azTWJzKcQ ,EQkJc)=q5f4rmY阇zj"RZ{th>}"K&JִHRݡv]aNM8bEVy+JƊ8LeFI;hF]Ԥ/׺wzDe&2/J4^gy$.oYdF12FLd=R= F@.E[?;sH>.B}&)\nΪr1@wcS\]saCOӲĢ);dFkF"QbF(7Nt>jFy*=vL)j^Λ:Ikv$"GU[J+ZXrb}]y*WÑU8_"oђ!M͵NYx0->_{Vݧf%QCsk\-xrJ񸰎w\XGGy}qa#'ˣv/:aL=Xb$> 21߸Zyw}`|oY-!0WR6g<)iwI'*I%)^M qjMj)8:!NRpkBD-e3F 8jtnJNO~Nk10SС 0JLH\2F$)TsR`%4>laޖzI9UY0Wi^L"QddT=?H!ߩU}8qd†xcGR4+WҁM:8e#"z  pQL}RI~t4>s R _Br44GDRqqz^(noaۍ ~-4];XNEb9%*K=R !bdF) P!3]8/ Bs&L`lϠ*w&aa`TN =9t1,mW-jNsd)\N) LVZgR"'7l)3PDyDz{7xnM/*/~=WoJRVuZaPLq:hπ*V<SQ߉RZM|zǵj%aU+dK%0׼ATefO@`qzLnKОS /G89W,ȜpY¥1&@a77bIn3\U1eNbDdc[i[\RjH%%Q(%NȖX!tJ1)!ghi ዋcGՊ) c{%th St[%Y6lc!kJ33cfYl.-tuGJ!Rb ܎j.@R&Wr* ȰĬKQhEXv&iAQJ8@zWlM5;OD\zT3^#q9PJ2 Iq縀*bZ032>:6gk`Ivp>4?ǚ*0 ymè֖;X[O?ZKܳ׺a ,Ջ'W|˛<11Q'Ffm}̷a 杻ub(Me #0O*52mCc Ùd̆_{ہ%_'ϯ\fZj)'_E!gqϪW2wuWo8X^Oyy//ߞ[mkb͘by}U:H(; ,IgRǮE28ܥPƱܼ4Xڲ[]Ǘˋ B5":~^52L.J5F8eB2fdF .3QDb΃Qe7רQsdT)cx2Zu^qwǩW^ikMffaH \$cFgh"5+yu2(LLb.>t%^#ςY?I1 qkG$ceB6]w$=9+2픸>o7I,8#oF(_^בLU=up=|?ßVٻP',S5tzr!2.!Qr`6k+_u  #ܽ$b5UXhJiQW I( Zr Dߙ  Ht{PS8'}T6|Ǔ3Fݚ澁G-caQclE!: .x~ W_084Famo!C.Ip|yӝ6_=S_)HS _=FMxG I n%SD_\Xp9=MyЮ6H/ EÓw%!,bhHQņ6H-Gj^uKIDkG.%#{6 gcrE:yfP]+9Q_޲~;n hL\ڷ`)dE#@ihe6bDωYʯ 5Zd`h lR=xoJ"_8+EV;ƍB2;sO*9BVv{?Fz7[n[h(:gULA[iQG&FCUH>IƱF7KEjAc -r#w//uu(BR!{8%HE,!zTҞ]b7 "5HSB< \"+ÀE")1R,ND1/bp[B@RCT鎥]%R-Zk-LѦ:.{\Z? 
s`nE@нba5{S!V*=L A# uzwd ^y3Q)6HN%O8VIJAZv(HiֺMpu(-~2%TtCξq.e{ϫID㉈kփ[wEę< ؟ժ Hδ*ՅoeYh]2^h&^':tb ޟ ?9P*ƙ7`@4`]/$.hX1*xڢ<T _ f`7!Hp( K\p׷p^.UX6l||AÃ-_$ :>e뾈_5]*Dac]im>Phry<7ԥ ؗس:h7gFFij;}3"}:>e ^FN^]^yq/`b#җ `!64o~xg^=H;$ғs)p>Y- :)J"7CJ&%"Ise2{g[i}kc'ʿr:?YaN&;1GQ m)4@({ʎE, +Dn7*ԄC%9H,£5++$kUlt1̡('pRd$EnF ޳=528NG.Wxi2Cr#7l s+ԩ޽P*H8d=䕔p~9$f4 ?^*[^ǦpJ:;  z4]\ )OG.YM΄D`d s,Gq+йd e^ՒAiO1Z)-A+DBct>3`B_m5P B ;vX/RY4\Pbb`T6,W/ʘSq@I޵ChV:Qf.vz Q`0O `UbHyEhícllF0!3Rj :qsOQ}sΡµu ʘ[kXJ\8.д%+[#(ʘZ"@'pQ|R Wҡōf_;٨w$w 5!JO]}VQ\Z8R |tfDaT@.x>?.dSSv b8[ VXjs+pm̎SE }SAh ^\E[ X9XIkA6*[sJʚnJ/Ю{'ڗT)0ul (R@#@dEE\BEW)Ɉ-';CviCAu}uϮ\"rovINKIv$͆+-w5hR 5oxrIpfs\ṷ\lԺ:ٔ-fHӞ;MjoL3WGٟ,> 0p%ufdfT2J6/D8OArj;2fl#j<@3P-jN3AL$ۃڗra4b42V'L" 58\>[uJ ̝/h\98<>W"n㓕b?:Ջs }E/Ǘ\勗x'WoOƽ\}e!P"tW5 (%w\ﺡ|yu||u=Z_~䆙?G7dz9RP5L8[ ኦ_nbwR^drA: & ~ʿ'/?o׿-Ձ;'[n9Ya^H˼_ C 2Klю,9E).nIV٭n8[xXo-[>,Aߓܣ.@sLJtC|z5L61~r^XfJ#cm]L3 )AJjR0#ldfS-"Ic:/p4/}ҷY=2Z'p䲠4?{I$\Hv&)nO&r/8  MxAݑZͿŀ-=i~:Myfb!N ݶ,苓ͭ{\3z#?G{*D@T~_zwޮ VN 5z~`UٺM*o{a~5{hAp9$h+8;Yimڥ+wMh^IӒg'fV Id FO*"|0Yqw6\"\ᄥF%xg Ҩk΂;-[Ŏ .G݁,YO!A,sKR,guۙ^M-#\>CGq~1fsO`aa+Тֿ g#Ha N0]" j^!S#7(}~`QQ%p?EmK:0飳Ш0zd#Ɂ@_y|}PBQ^&T'r*&Ѷߙ:3ӏ&W׿SpjωI f鐳38&JT۬YMJ||LUVm34iH5sx*6?&©eȸNq>3P3HKJfȥ܆dtȼe&3GYZK~Cvwn{kE̚c fb hJ>&!P_ڄbB h](_,޾Y|thqiw;:PS\ASoo.@ZsG?c=йehj8+/"ٵ v*ȘV`5!cWA4^g [Ư \E!Y<*| 'xUI,ӯ]Oל831'^{\zh(4 `2o\Vw!Ɵ1%8RbheF-.2(\b$,I)+ znIȞ"M56lg HkR9N;ʭ(TM٧I ӂN]^67Y11<)|Fs1 _ᛂ,  / 5v7'PĒQ4U0GiΣ9Gzn$*]REC4UasS EV̂ ?;O)enLeT.6>tO} t4\'Ќ2mz`Dim4s% hkS9!j+8_4@Vrx_Aԅ@@T!!D&#;lVXjj0K#.K}efd20 ޲4#p_K1FT``1Tz&A,KH!l%2n93:j X֔rcrA]n rvnƖףc0&dl׋/B~̴ʬU:y)3~w0ct/ xq8 YK >ٳڪQ|y~a׷q'o|1fv zbsX< zZ嬄dW;eK6^}֮XC?MZ߇(+PPxo"qՂ+ YTvQzDꎹ` <~*G_b} j ޤxe4^ (IRsXfsČH*T:}'S;`q9k@A(dBX?2$ȩQR1F)6E~5ƬE]xEBd?էl"s %ab1gl(ޡ.PX֞(륜ZXRoM몼nTT!i$B"ͫi`(Nҟ"./wүdLL)1^uRNqNzE$ #Bt 4XHe6>duxU# NLZ`!X>av쳅_~o_¦/o`ҁ匿dm}e~gO#,+cqtTq8UI eڏKĂD]JKϻ3qFݯ >+=VI:H ywӃ]L P+2T8$)Rxu+ë |%H̕H! 
܍!q9z2؁.G\]A&gi\DtN\G8(?e 2 }CXG5DPJ*C ݰe7eTr`,,[D7DaK '>̏(,ii td _dI`@ auRQx͜d 兡U!j"ƚ¯Oĕ(́ $U>8sFidG cf984K&?"^Fn$HvSwC̟!]* c-s Nry(-!6z5>6Si150B4\p߫}Q^f\| Y\$ۑ  (q-8i.ȱ *Nj@I)h#9Nǘ0D$ 8OؽC>{tP?'Zov-KXOM4mziY (NV&*EcB[ְ֧(>`=V3g]/?<LO?j^1) 1]옦bJC,b]8sq/V] 3\8G\qĜ!@PB4X0u`}Fƈ,˖) l |d`{h哜W!Sjj{ۗ'˩Y~{X~lʠt!Y8S(r=)D/[Go{"ͩ]nuk!o:66neҮAhg/gro]huw/Vk]1Ɛ!gTvi/o.ytI>2ר\6{Q ;y* \ XLiz}` Wlu:zcP o+]hrvr`aG$A`u Sn~F9qq>ȶ{lFm;z/d;k:Dso FׯHP<<|x= !@}f0:دÙB:AsCo1[$3Gt Ht ũPK((,QiF:?|B*0(9Ԭgv^D{gOy|~vZ6nhgn~HU0Ɏyn%x%酮VT! ULϷwZn6Q"w w8~P6b`$<-7@wGS{w]xBa+vae?+m>٫mFEԅ' C}oz |Y.'7[?[wk*zࡀ(2Hg NGQPpig9%Rlp+GHF qDPmU0h` {!YV^ۤ{0l{5HVj3ZUe橌VI-/)ެlUc -2uiO8D~8n{$>J6w<Wb瑼 -: vXM>cAr"X{l ul]fA[\AEL3'Wb6#Y]Jl2XW ١e[y\; 2ϰ{ٲve[|j"@gbϠ+º_\9$?hkK– - ("V 5 r-kI^QF!k ՚@!R_65 ee=j/ZQp(c}z]"H}sUj 5o5*J6zmZ/*!k5o.tf`WػxfISwxDjWBHTlZ*hJ RM^ ew p3Dk WrԆԳ377JLwZ6StX}cl]ߝXyضFQ~.mv'*E3QIK*)˧ zjv"6P t!v`&.DCq]ևigRa[R4QM:2v"cs ]ܬB^]FME0;3qw Otn3݄fܧ Uݧ:wșν5 dНݒDCt kȖsL7BǸ4Vx%_EumX` ZB]_ڞN(.eҀPo2p$}2`p5y~:^!ǩQh:p'0L; 1_o(H_$/e_xuHđg:+ kxT+I0"K Q /q)=x>q0/'+ls/^r9b=P[sB̊6{((>>[k]E1%~x5f2MX0Q^0TN`Ëlj;aŏyOPR"xY32D/rڑBb>FGqqFܸK2^iGNx"L gS H cL2&0ь3èPKJ@WpFZN1YJ)&Q>e@7VmU7 ˊ1[r#-Vl@6Kpm(8sMm[qKE0;;(0|!(HS[N2-*^czಛdW}"-!X(QEjD&f!ҩ-c >0Ҕֲ|T>wI2['*D`j#VJ2_Ԍ@`=6,~mG"T ",>8V&9bveZ+Y -g?vM GXÂ?ҽܹ+qW\?UJ4d/Ż(؉kV90q0a҆B(c$ .A \:ü!%"4S tӒb9q:4G{ G,Q^k宽FC#uIj58X9 (qz^ v;D@ `& T8{" wxɄ58HFfas `smY"(XJИ LG3$^MnV3v'1Ȃ@(#Qr(iABRXlVhFt6YFautp+lL tlm$y/{lIۋEɵ[ՓCiaw7|jtY1x?ylO6=O:I_aZt7i=vX( ,~'#!&†藅p0+Q9]_=E4C{oh7B5햋Amc|;7഼jZ+j:$䅋hLjݷk]nT[.UD'my"@E~ބ`Qޕpm\(zerb455,sQr'7Ϯajf'ǘ.?DI?3}H)?4]?QVo|Tu0;O3@JܹZ(Z_ûQ, C5w) NNx`$b-9MOky_t>-L jrL2]Glbk!(cHȽZOߝm*ti&QDM"$8"@AhF*NJ[lX}dP[lL{Cҥ   *x#3''/׏Mp0ߋ+c/`}w>nk#22wk1X_J&Cs&%h[hw'yw@Ws"[ڣ{#(j<$^SGq}{M 0#I0 K1o gѻR4ϙ*"IiL+;bW-ƍ[,ܠ9X;|`/$|5i>)TJ㱞zR3X N;Xf>LJ[r[cQu[. לY_DQYfo{9L;} jQ3k=_J c$KI-gH"AI[N ? 
hmE>PK0bђUR 2(*AA1bgXm絆W#~إ›Y٘AZ#hk˔+"ft2*X ڪ@Hm)a슡"BBZZٚqLx1 gq˕B$"2ZqG}kI< O `RCa.3EK،`(J^tSXh@2بt8y0D,%B Ǐ VpB^ecPXE؜L-l"ZVO{0k=Qo.{\m(TVW7;PNtDM`oLƟ;X/Y,|s܂YbZq_ Zoz?^]=lݣj#[|݌dI0-P$!r)\ߍEVF%"T u?]Z%xuV۟pRy;,z(-8~SʻFB*>K2"m&ϐ~ݖCΏ U+T.]YG+Dxb}Zٵa/ *u[ld؁FfXٚ "ȈME$wMc^;:Xt[$W z9듇{nW>ukaeOaHl9`H{|*c=ox0,~u.ls~bDny~+|5y`9캬?ۻ4߿<"|}i<(hW/ZSv kOg_S࿁NjH j̃9Cq?S;Q.tȜY-n>[zİUڌyOD6 1v%o1!FHa}SSz @qQugN^f|@K Ɏ<vN`> ):<і6{~G ^:zeJ.FLf13&SL.;Ǯ۽`N#n"}#8Y)Yi2 ș"(0dGإ,][[B8!q뙌8t%]iKj闻$*y6vޢa: d2s2NWkqm6$F ("'3)Ȕ摒]J%RKbyÌwGO--_:¢uϳ |?;ǼpLv-KST[n =CXdgI;>k86_;vlNM0ϋTH?5k V#hum݇zRJ4 7u+nM_Kl+gw NhFcY%I)>,)ԝ~ h%9]vI0m_EeGP[fߋxLCɎHՒo8n8,ڛEe]v0:f)rgo8cuA>@([@8PJMā0Ƣ<q1n0t̄pйpFN-up(PSRMWT ʖ ~jxQ~'mce=_JIM^ Z`D[mkJ1o?|4wq~Ĺ̓g S0P1Kz4zEqH)3YZ)qс_:A "]DEca*j*:˧v65n˧%dO]$vR-(m~vaC7e71?.sీ G\[sO tocGǠe p ~:TG(Sߩpivb(EN/CV2WLEKp=Fx9Fl+e2^m뷮 b1 <E](u~q7;?{*Nj+7B⃶8笣7yQVRloKTAycscH} :R<[BiPX_T_ʦϦ^ȆRnSʛv^*hF sxvn rT>;W~`x/  2. LXCQ Q;,yoQn;AƋyKݛ 9&q80m13lwh 6TArO%TAP,Ө'klT$ig,hzIv )0p^(^5q_F׈b6@sEOq`]G?P@?[yMa1co7n6yP|c{xgMIй8i<YgЫOgf~r?K _&x> Nl!J.h * jˢh7cx'ff1HbƂNҀl2EKPHQ뀤v0*!pg)5^\aJD4oN(CM$9-gaۆ $Q6~f(=f1$?b7dž(f D9DEi$H )v[#XQ0 A7j~:i-J*; vsQLD,1 ҈D*03}@1LRH愲X_\$"PJ""~KtuNƕ>V|*7aOt\G<|!Mó!7OE㍿3>>|sVQO%ŗnIL3c1Yf^C4ϖMkJ@OBbtF -KaH={ aմWnɈ%sur]{tݱ[\Ztr2̘&سy @ G!%7b;:vlƂ`P N<Єe )Zv 7˩R5ghx~P] l/Aq9=ۏfv ƍv`G)xYEC8)8^L- UsKvig*qjp`C PՊ) 6ƴ5kNmZGeޭ~} sȻw;>5CNn s.7HW .G+3jWk_#ڔkWBD'%(ÿ3{349n@@E-~ܒ}*WH':PK~8i2á! 'A˖rی,CX(霷R!k[vQv뜸/B8}W{G vޗ]l69֍_kI&GdՒ,!ɣ ˭&bE!r5y^y踕c>HлMˏBΎq*@E\a?/v`{^}elXu1kD*u!߸SpDŽGLuL|x~{KÍ3qE>=\?>\bشJzxWؤb˽LREvXnQ{&hʃs# ȃclPfX!߬5k%id(q*AYm>u i7_nT*+&lʸ 컖`x`5m_T $=Z rM$cr5=c"< >H4lBٲV\^/u$NhC'=_K~]o3dB]qmf ?Nk^6ӏrRAO F P437Ycd`FiS bYsJY>u#%\[\|O.CpT|"rD -E0E*W nn҆z%p_y(T+WPd~yTWV6#P۵PYڬ-Fy}w+g`YIDYndAЙ`#Ah-c<6FKd'VfZņl`M0crs#IDko?yiF)M;K彧-. 
9نKI*gP{lrNs_;l׹+xSpluH@ U/Ly7N*fwT qqZ3]HYJ|1_t-**[S׊8NqT<&Vsij!GC>| jȫ#w#qZ oynU KDObU*KJ *D9@:G,QH G;)O٢k&KRUnA{Dws,2"`m.’a 0cy d5Nռ%1n֚``|1pCUCC>*mz#:heQ꼠 ,fx<CU۲քoeXJ-\QqC;s;ck"xE('{s_.,EryZfzT|,<2kYt& %B"E*NYn m<{>` *MpPzl:X9G :`c'M}!wWO_O^\ _)o?<=1Y6wϿrLPM+>X'.L%#LdeşI 0x qT#jX~Q`'otp6^*vZ}H?=݅WE6.<ruJƖi(C;yᝊp 'ixd1@7leZehBg拕ݵ37믃 dwB"3eX:Ҝ2b_GT BeJ[7377y6:۫X"!^J+g3Y ]Ƃ ltK8Vru!ZLL8-/tϏa=]a&{^~nx~>OǯQ;㾻j}yxN%Y7`JC6Tf,Jmt&(®HP.y6 CQ *E!IA6G2lfFpr egҁi%s_3ƽھoĢ_k=Cwȯ4%|Ι[Ιit9D&znZMXF| U):{cNͥBܵ>m6ckMdD⩟WFl@Z+?y_OmĢլ+S=e+J6PB &cɮ\C?!xnh>8-O忣,a!XWǫI yOWjW#AA5Zӭ=эfR6k8k=Zn$H# syδ۟ca|v-OeZSʕ׶F",po A2, G dp3q,/*)r!.T}!R>D+=qX=iXN4Zr2¡oķ%9a moo :qJd|۸-% HTM÷`^'GO-BUJn|g oLi^-ʨ"f4k#SR׫gmS 7hc(6Gm .(a\HkAkシҠ5mr@#ɃhyzLDF:"RP6b$n2Z9Nǩ&qmr>TWpb PJniSDnmKϤkU J*YjF;S:5)ZQJq?RFe (9L)}D 㕏뻛5\i}+ z-S26e7w yjqtC Z tlχm\Nf癶{݊’jI;iHr )$l tGD2lٞ5Ǔ|+ƫǖv-'H% K u7h4V;t#ӌ>q$fUS-(۟Kuձ:˫5jaֆŤþRd׵!D}Y5KVI]÷JRYB neL9c3ń1T.YWW]1vu$z$e7/IE耉y 5w$ڋ߸؀4]jPl*JF\% ҙ ]L< EUg ׍l}_=; ]KX\H9'B$$R~[xʤVx-I/ b%9J}AWPFLyG7&`2B9$ۖal)C3v&!hk0:.5Jkc):ʐ兕 s1[3b[ۈlȖ Ek twk[d=h`l(݉Gn(`m,(DE}H'&J{7͑m`y_b5}Uޒms{gɕ}%ӎIo+2k]%_dsiEC䎒ArogmO֛Ѿ-rsrUۓD6[Ÿ[5͟TfInSNh9s%iQtZʰكc$o-D45b@RbϛXi*br2 >RW#'G PWfMґ1W?k1uKE_'e{lEf?s㚆@?NwWWXո\gUC5O16޵6r#E'dw$XYJ$A+$b$(`r]UXU,V5qW2MjIG[hq`G9fp.o-;]16YkWFs}X1I)la mal>kOX:u; ų9:sݔay@ٟ`݌w4^\],ta<AeĮt{*Z u7Fwf%dt.赾RYlob^o\oeHđ\jO fEqvQwOUoQƩҝ[W^͉J5:-9N3YC B.:w}kCy87ʅ}!f?y­3ZbE7v3K l,WF~+cW}9NH-gVW+ X+6QS9o}SY,)c|fx+kpy8~0nyYfPׇN^ffO֚9+) 㰛|ڽ[^aʢ`Dܜ(3WGv1_۵)9t #5]ٙz|bvw4OzxW/>8#sE~{rpT rx.ޟ`R:;2X|p`2-"N7s :'vyr&kXB/Ruy. O> I k1c-5UhkJ^A_o9BdFs;kդn$SĨ>1;mh͵ۯpxˍSMkۭwl- V(VC^0:9dvUZCZW (S^51}FR+u >85>䌻-JvTAgUsD:>HQ q_ƨԜH6 9w iJdcӌ59i:q"ژ)rԷ [`d@w/:։?+&cYi;ފ\WX@{Ȭu1NTg;9R9dBЏ2}z;^/~={Y Hf./z;u k5D [{l |\F~0S&UBM~Q3ttVt5^W+PELM|,~0M/U46{_K֋dٷD +#Ï0G~ex0m3V V&9`Rۼ_txܽj'FiEywHR>/)1R0sd^Dpyh}冴_+aYGn.yg- 瓖q҂a\#*6OZ(ƂxҲ2Y8cYK`+g,-[M䭹%r*.r2y…ུ-0g1tDE,PPL9:B29 Ԁ!1:f9<R~ ҤvZIU w.h0 {4zZw% _tJ >fMQ3>?hY@3^(%b!p3 P  ? 
[binary data: gzip-compressed contents of logs/kubelet.log.gz from the zuul-output archive — not recoverable as text]
xr/VS|Lܧv>f#OzC"W3N@Sҹbd8^"TWr;,_?/Qf:>~"_KhO1yeUq)t+ uJh"ݎM=t+)Ew)R06 0_7Nf5bӊ j2㲢/E;/@އH&'Q"І>?j[ҟvώT2WT*L!G=%=q"h8$įgdYLz@BM!k2IfYxi;-vuE%O:RNA:q3)hSш/?b ߌ|=_d<k]>s%LFw=?LM/^,Fß~/Nx=H܃t=H܃w=9rilw:x\KÝ7!L?ܻ&J~󁧺 Vy`ƺ鉮HM07^[sU]\yח 3K&ķNnlۖ+5$bO8]JB85&bSS6NϿdO^"ڭpLo흘hJ<`zk5㵗ONlWLτ*{y{7y @}0~dt' !L~L m}4Dey-%k;sZlNL-|ٌԊ=%'-Mj[2-̀ ]9{^.ncٚGsf 9TrBP)92#d>u~uae|dQwog< Z]Q8Fнa|oVR-~7el!iX{Dqa-ZV$h+FlWy_)n% `X( 4Ai5W؁<(,s!%Q=7*N,lQ&T䶢N -Jy8t9/FP¶L(E2$jLWY^;%BhDޅ.gn%oN&/~4te`6qJK"vG! xRB; Ae'U.=/PCZ =co!gs|rH;< u$u6 j0 PNg3w JS.ֳ0%a4 Yk3Fmd @_%TqzPdI1tzw iZ桝Z;-Vrwfx GYRU^+W1nUĕÒjs{M129>EϬ!*(xko!t%])i*/X=SRbM^FX(E*V)]-n1jRx/#3R⬈Dt6M9`"Gp _O^}BE( ?) .f7vyrz<^9Qt\9 Ս`QW!P1leXYǝp2DL,w1 kgagtL8娌Vbjᑖ8G 9.ܠ|trWx' mS aHpbZ‡{q+WL% щ@ D,2srD8ֈ#$ lE!,,2% ` K|5u%`*~)M7 [1Y1L{BT"L%JxlBysK9#6H#Jb6b$pӒx|7ZNGdAS,*WN,F4FAq`DΫhr^*ELbQ2X(Da'X;6U62UE=kQY@AZr M%RBx (_cM<-_< QjOTdJ  |8VgI1c [6٢}_oPojk~ ePA ePA=¸CJ{zsi?cm`1aaoMkD 1#ii>B&Uf^1+Uo^;\Y-_616V]wc)U&s>/ MMp L(|UZ|Gdr uc+fL@ N 2T"'J{ϛ}2Ց{1|?/&K.3J1F/C3af?`]վKi}J>qgLU&w)a|Ja3]S 4?z"ƬU?,+׋YJ޲}Qm/A8a5yh]GndÓ0Svb*0S2jg+k۪Y ",{iMVX [ԺIkɗC?9fނWy@ A"ffZREXD͵xLTek+͋ĮJTpN)Ua ݽ z"[O9yHZe%̟jbR%on-1fzGƅtsG& UT*x4Jl jmime\ Cvj$PLp0S) Oz-lMuEQ T})Kq|2e昺˪EsOq 'M=^\i.S d9$b0@E(i@ў[#u* ㏕Øߡถ&>ȺX9(=o>mӹDY9h%8<ԋ~vݹ/?P?0<8fQT׷o\ ,(Loțoc<'DUdW1ޏ?Ɠra*X־s!%`^}1jvލO/I7 +arSf8&=8dȚ8+F~z6h]{aL^㙗K0N].= ޠW\]8!^fc4RI=ꨬvb'=eAi&/J;*\*|w>MJȖ˫}jN ʫOzGMf.Ғfk D,1k,@4Uq73G[ssQ=o8'pހ^=W*۳G!ג;y!nkyr[yr7%762/`Hrހ``o6q'aD*RT.g?ܯwc,9aC|nw h 2"AjXˠ$Qգ#4e:]yv;hyf`y^S/QhIY yՖ/ZmӢws/1_e>ۚv z KV`?o6#I6aq"ox9׀{h~8Vۧ|B{C\7|ȳ>:,9ߞϦНƧy1;| $$jLU ℐD #c~cod9zbwmh=:uSp>jQmid>!oztS~2N kqK(~2src*Fz@Kdvxk1`)>v"% poz+8AOz+8wNۢ>RsS&)]좨vSR 3rFn=:;3q#WJgR `2C*(HV9-Vs+r" NOJŦRicPBgl?gH^[ϹE:_fNo'.er%&* l)-2j#-AV I.5:J2H(VAZviKz@2gCg29Dr}b ՀN%液7=M\AY mê7+Um˹d4 AzmA@N&ٺ@!>bcMAE%Ud*y)!x9>U`)&> J C Q*ehJVY.Ul^٥L | 3Y8[/ ͹Fp&!Ex{UNs83%49e:P:|%HuP&Eo>-33}<o=2}O4Zb27;~Cƙ|ҳC(cȐK_yPxH [b\6q]i8t[(wj, Vcch7<`x]wbl `Ukqx5P/e>OXkSp~ߣZ& Sew?l7(k) / =4zX,J!Fӝ 1x&jPn׮o;CO2/;(gz`X᙭z3if|R<=tt[<&!:1lng;Tf|{ Tz 
)|̽bYgFL;%=bB1kj7"5`ӨP/nSMpp!=,CnHn4fӻ*W`#ۣك|xH_ m l[oAKtpӷvK n]H.eJqwٱc0mOTqHOoߺ#DH.Zȼ"$~j>Fn1&ݦ̽yx+:a0,4Ȇ]'ƵJh+f{bVu .8DK{]-"}/@_|*,S5[;0'%NdI'.E:eW,zv+ĘblǘL@ϠԴJeɉ"%(qW.Ƒ:vم]q!_QDSeѝ(XEE;v3o*$/| |:rrkf,'gb/Oa)r0;f?ԗ1b lzͷ8;uo¿m߽ͦʷ:LUcRʱo(hciKLuv7$Ƽb\Uv2WQl:ϊi$HOi ⫛,w(B92b5Ik?DSX_:*!@A,t!FTO4e_7cb0N_g!~] Y@>wݐ_()p.׍-C(3BSP3UdVk䒈i12|bj7kbzx0~I iɳUM5z>!C) ИD!c̙=Vi ]%S,1WկҏW^ˉkh*58_MFWUpTdh[ppPvm"G ,0zpW=xrbያT4+cS]X7l 3O)a|}EZ*S &ƢbL8.4U4Fj(Q*tiE*l+5jBz8QX]RVj4wNdz%L޹^q8]WL&bVW0Fȍح[6MZ(mY0ٞjMH2y}>a҃PBT bHIYf+t: f[=u:ƢeJ`qE^a+ EIs]GD9ϻe+ lOhob[;߾Yۖ[:G!M%NUJa]M҃_$&FH/XZtlu}j/U;wWx~|Mg΋YٰF__\}(fWj+?^ Ww_$LL>Λc>y(;gb߼/ds_Z_˝+x 5aWC?.FX +{WL?{3)pbt,v櫞0:}Sr$>G}"}Wkř3Sb z)L7ֆr|=Q讴 "M"hq%=4ۨ=kbv{ (x,&m<'B  aA ($u$Ф\/,SS8ae5)#)(`((r.1%a5ߒ|h*7_|¼uܝmAo}'¸V:{| h]?hxx\oXD^"ROj-舅ymX:Y<qm16Vrj?Nbnš\Njjwx  {5kTpF13,𸐈JQd9vOQ=O^H7"EK38+&(w S-#T6eTf]҂s~ j΍9ռ7WɬP;ؕ;q5zd뀶VWOyM}QM훧/0#)$w:-|-hEq "ƈ\B藟UbŎ'ЋC.@ z?^jAhR߯?gM1<`D[!=4mmu8j:\\Jħ(}K4@Sv~~3Kk-O"a`2)3[4-xdR8sTӼpN U I.P4DG=TSnj r2Z(ۃ,O2X⬡UKC,O/"s"_őb]qHTck.lr zLvօ/ϼN4:(7wiB NOd,m8]sWL4(okO(>;]8Rٻ:Pm SxBa:dt{|˜o5Sg29B'o]l |M !uowAt-k(>|0vaV*^DxMF"8:5+ $frJfO@EK03ujfSzڹ$/6Ͱnۈf8ew&RF6Kyީ o έEY2&!́2CƄ!.-r|/_A PK{,O|!HG|rDz1agADkW3sO}fdpm$~2D0Ӑܗ@*C٪PcEW-*qJB<тulDJ8,HY R_|3Xqw#1a^w$ćA"78fucq7=2Μlܿsx1h{l8Է,`vwc{;9x^ ;hhHFX R\AM0M](G ΒфC:9{bG /:*G "sz9~KzqRJ [AnGd'a!gG;{ w;HRa-e5+ʕZ,WbR%h5NeYjdKC`@U9R6@:= ۧU 7bDʪ@.ƄcdA+VJ8RK,J"UE"TedP:'-17@1 ٻ6r$r7k,S],n3֍d-9W_mKn%kLEVXb$GvRYԎ+"V6y5K&pgA%b!\ŊpB<5|k7<.6`!ډy# Pok4Q-,:.EF.!(~`!I$A6TKIԖ6=>jx3k-FbcObzGl%j+}74dX6F%[it\|3s$xġJCqT mlR,1S4ԛ L-FlآGÑ)8@:h9& ;*@)NDAT)1w>jMbPM@y.ֶ?+l[ۅrm%SF ]nAhN9hf OEYڭj.$䕋h+t-ܱCi;G;yjm$zn_Tu!!\D[T@-!hN9h Oޚvkڭ y"Brl7r\C%-ׁr2ޟpLlNNWdOjt?wkd_!UܔM:UQ✤w*$j6nڬvN_h]]nm)9sЎ [dCsҗ@!)/\.ĭV]nVg"k'` Zh7Yf4 z.RvRRGI^]<VRzus*ZgEt:reTHeH!XI *RUZy_\Ip(UvgdŠYX8_btG _]f =7H,B~IAQ_߯tyu+<#6u} E/ ,rmY."EKe mX^BK9 pFLC^%W@ "=mnohF]uy6EZyބh=ڬ)G.D 7w+Ø0RlLZJLGRz4 s#%JÞCq {sޏM$lOi#9[%LO?kLCCGL>E|Tn}o/ƣ4Ǜ1~2y;SΥrW\;P . 
Y1]nCѧ)hfgC8>=Ii6)Q[uQd\L-}i:`U׻hE!_.?ga{pK27L/bg7_ˉ{s7EA=5Bu`Kv5̂J%zU0& ! DzAF,^3Դ.DK^J / 2Ih ՞Zj%k0Vpfߘ L!Z Z񻽐-gKmQ"v2%4JCÇkT$MRBS QH&3lFfha#2FMNVKOGS29|ʹӼfVS1 LGO83a")em"RhOCN19#3Fmf MfB3ko7驦ȖSU崉 ) 6pefh\O[,?J*bЊ9Uܐ=C/\EeCRPuز[^h ڦxo͑BnϚ{FT-xBj'6!R>w& hȐ # 5ƚJsPZ9A|EpqSiFOѪi_]w.`iwKY#@6ُk[d$/~+ xb}e!|gdfKGG_$W ʬ)!gwhz^BhTPmw9}(v- 5c[2U_ֈ `}>k:z?DK0wiվJ1͡˰WV& VzztRڂPj`Հv4/q)#B,r=DGc roжZm"iĐC&Dʃcׄ +@'˃+ < ZZ:+˙ܚ/BvACl@.xv=?˕`+2!W.T.kaϹg6k֖mXy5r[P6)ͮ{  -4ذ]НfەS A!@Zm+OڬSr?6+7|hAc %Ώ/J2k;+_.75q#iv!o.hB֡G{No rӛHc'n ܗ3  afJWjNYߕ0շ/Ѩoe Ϝ PesP~r>&0g ? dՂYMf!08چ.v=e u "+K@'Q\o [ cfC'2mF |OY1Ǟ FLx~?.鋆syz29",`ArFs!l2%fvf,Y *N4Py0&T2Z1=;^R5ޔL7((uK~gndq~gC i׋a>ڑKip[u^QaU:(%u<=M2ޢ_vd\K?Ϡ-l:qmf5IZ {[fWc ;l;C#jUv~y %wSTt0%(GP.45D"s ޜԏL=(^" H񡷢&|`p<،#%_-e-\#㎰'t^ ];h&Rx9T})զy__h!7$qҔ&1[2 7C\51on.Z\B{ 6I4=7P}zEFj&Fo';>]{^ !FON_{;_^g4F~v jMbTQq*+?D|:' {?ߞ>'3[ގka!F%䃗XLZj)RR:jc+-[Ղp u;'}:6Lg7ؕg4 AkEM>+t R|~;UDHZޮG__g̃d47IPL/'>>Z8ń0[I߷:s;nT+BFj8S]@%qqsǂuE=WQ=55MG^Fm'z;y'U 0Pl>hvk~.FzeXRؐ$XdLW)8 NUaS UY7KYW $bz>EZ1A1MEI•UgyTY"n\~d667-e-Uoq9ogq>F .‘'TaάGV8f-w|,d*A·xg'/c +ˁ}?w~ [cfGN*P4xtiL&Ӌa=(9 `\r9Fӟnpb3|3'8ӈP<=\Iƌlak Ȃ/x4"HyH ^jkLp4҄hxKc^r"UNRJ ,[NL>P In䴲*l'9L*Q27q7CdyFs#Wl!7aA7[އwFӄJu^ WTp3lDڮnfuDI9ڳpF .]Zm^&;ZN@;LݻHS('PI8.My *ǯrǯK8G&$#*&5&aXԜQAV 퉷VhRr >-Ven=zUI7"ZM 81]Ժ{Xɽ"Bb'9S+%n (RN(~6Q07T+S}Ɣ,eA[(HfZkz|ǚYkOi53"O;KIkS3"l} j(-$ĸ qч_?]%w:DpUIJ+!2hBU12 8w ڬCd|\si;kcD(Iƿ?-Je?m)sZ&N.%6*8p@a!?~}񔴁A٠/ZOEz?ybb0 =?pI,Z( (1dwDkb%s?1 =\Ƣ$8s(ˠiJ(EZ-T\t0ĆJ4B4T(ڨ5kV=Wcj3k N `))Nh54AH"Te%tz9@qs(5 yH BXNFKvJfkPX#d(KMLF%@papfpFJk]|!aP<LVoGќ ƼC%&WЌJ(/ R(:B`r\GQ\A N=YN&o橰G嚰ddU6Y!CHCW #<&^gۆ`e 5 ?J Xy[geSI5/DLd!e9S-&Q ڈ0+`ZJ^%,^[e%0{ Z f[دa뛫>èIdNGjqLp|4gOG}T߾ywq@97Gc$G|<9HgJ AzM41ܣV7+[*S `Xv軐 )甑hm6[x6G3y_p0]~MtM?vE1g/?jgԖlXؕfI =MAXbHI?{W6_je#UyJS[u̾U X7d'߯AJ2%)5D@FGj1O}Jy&@zWSyB=^24 t+ k4PQJ^#ˋе딦 3J*a 76U/&eZ3 !q%T F,x[t{Xr`.n5xb*"݃"0"1W,ta,j kD 'VV4)+$px[0ۤҢoݲl&\avuq{X SCĪ% D սaXZ)Ӽ,8SòicXIsH)b%ɐ-m-{)- B"-u%b;Ba/X0MAV8*\KJ!TcPp3U&4n}A aϠjuLPD%ZX6𖽔ZђASQݹm YP&3$B9<ӁL.Z2@hɔ`1R 
Hr쥜p:]Ar!33?\}'=k}\F4*4/9;V <O$Jtt%,(CxУBF;P^7_fQw 򋼻۹9K2_YL4G^?_>wzqy2˿_FMf-OgDHwϞ6J~d-*r7ɮ9zdbM+vA-Mv ;o*4:TSAe_S&=r5=T);^t[>)ez2cfE.7(ǟғn|Ebh}Mav}cS>G4/vg.6)!Bk;3*i5;5)׈h!p_6u^-&c\=&@I.Mm" ԺQSMR8& S ,6=ӈ"Vy.Dԛp8=,w 놣ܬ^Xŋ Har|ZaPĐ(fxD! #m`UB>{2\kc),ℚ&שL"!JfSo>/ݣr5v9(LӰ 0_cxF m( -;5AKIpM^k$P1^9&%rVSȱ}Ҙ9.Qڏ ωP%״ Ip  *H¤B:Wb2E.xuVF^=Cٿ 9X_~Z-joqޒ*{+>n. ޕ7P ZH%#C wx$^!Cr'q U-$ U (o)y=8`iIXjybf)VZ$ZgTcUkdkd{ЫTӰRP&I85m=gimF@2L)a L]Ӯt2!U⧐>:G/C8a]QeBi"%v*IcX7[qZ51֬Ʉ+1j1 WJnaZ>6#SHmIn= $(F vX O`$e«+젔 Nq4&Kr R[)nC+f2mQ)/mERpw0׭ 60Tˍ J/ ֎W cvq4 WS=xLi'j F]z|QNSYYr'ߢܛM!|SႻOVHg X~ {%7S{H0dVK^\%aB[ ׼jK ¢L*ά0ck/w XוtmL֕tj5[*'V 8Á(횣)Jn~k_vI Ƨ2csi9xέfvPp8ԙ=l8YGU(W!vsh_}C=&% <$0&ƊQCeMQyF ກw%&8Ob()&7^FZǻV^×;|1VjX` yua*=bF@ {8=< ~aaY<HUJ^q3 0yz̛y0OS$XojW76oǣuy:O<[.&hCף[Ti>WƵAZDE Q)[jjnRP͔q-x1 nЖxmٹW^yXNF^@&#?Ѽbո.Gs2QhOZVWDz3iQI7MMybhPp>Jh蝅A6,4d'B K=0TgpZjQgXBu^/@;?ZrRSzWM:=o#׋N|J#ڧSL>8嚱! & #JD !R4qR-gCUFL۔c'TIinG B#h!O\Y!bf.B@悓y˅TkC HH4wX3$%:'2Ɩ(pSY#]B)iŵGYxOgiϦ=t&<Ad.MA/U9!FjfҐ^7F:@=RJ&؁điJ@GNiN OL1IET(9C)V10;&_&ZB9fZqb*\Yb4eRC׵zXAr GnsnVwonf:In]Vͺ1!C%V3T+#_hCN(љz[YD-~}W?˃W^NS[UƏ1-p@"]HIq>mX!/嗯Ol ̫ƹָm(0R#&sY\i!ĮeIek>|>oc5U;a]1V%|ub?AN+C7An}6AuWgfbHacFֽ͍375(Jj(;Pok=YIn DtyJyM,Ԝr<$VX |j;3i&G5LpKf(+%\]NѤT#\uGH1?GLa9WwB3|_ڏ0 AF_ޭ!ߠQƜg!}?lj_| ^U27-pK)|*`!}@p c45XgéRXbsLahltx0 77c:]؎ތn3וC &qF|֛5?>G;9qѝ,!C!$3&6.!lm<30玦'GCL(Qox-(i9Cx7Jqb*__ZrVUꂁc\=F إM$Z2uIgV$ax-BrH8-FZ!ylųDx2k1H(3oelqkg3CN1lnL)nqlScR zNDy붟*v: Ji` )76u & sc(a\0;FPS<]>Zxl鈖ψ(#>8j)wItw{[)=, ,Kc,9;"})Yb5M8٦Sziv=.5St}q۵,sāQXNBSx]t~a/g5\>\X"|~/E.E5Ǔ[?k~x6M:-ϝlY0=Z;r `P85%P g.W,Z`і F 3J9zZE`6h9jwqGMXP$n1ky=,R} ?w_A}gf)Ag>as!LN ?g򷟾9t~6_bC JͿB7smg?/Ss燇<> M64VtOߞ|'LT9 -RO㖓*Dp8˰`{ _Ca6: `OĆ9Lⲻ!TzAY 78AfwQ9 C_]Wdp@:"_*AqE!vE!1=bV.FL4 JV !DFY%CjM8ZYqAuI8:cDB LT %JD]{PY6 r2'ւqaKmJa6,hƦ0Qw!X_0^B {Er [~h[,eaR-0j<ʈ֗ Ѫno*CP]R輦aϽHEˑyß}+ѐj*GPGw %SM{P{a@i)63dVF:]p)b-Har6;2VKT+)gE)}i1O9CH -0 Kk\Ʒ1Lԑ̓r${ׂyT& Pp@Q)~K1[t7EDQaa5^'-cs2˜LgdF67Ǜ?qq+aHryҕw\Tܼj+wr7p'1cS6{ s F{kMfgtG1Uو'הSGhhh[@eK~yqS%K)?.~CwU7 suOE툒Bٚ5ъT~ >[' ̽P֐܋M4]zBc6 % 
QH,biU{&\͕+6:a:6~TTA#$JKnQ2;jQ+APw9Xi6%;JALmI( :坦 C'ʺBX4iyzez 85]'t,(Yn: 2٪JoBQʥP.8n;)PN%[b*r>sF99kg'7~.e,Bo\_as] r,q0L?jO-q``kA%EjY{-u`g3!0^lM,`LKq]jOiY+ ,sEJ "lςr뵭P韏>Y@_Ѹ|纰 'U." YWnTC=b s U2W^Z.К׿..ThbCJiD<}~>":G\EHAnP1ehnt~=[veE{V0LMn#ENU)!FLqlj4~r8A =nkm4R[ ?}>ѳ!v<3!?tl^DhJ^j>}(#iiE3le{tcObzJ*y/*n-FB Ȇ':3=*K` a, L_#DPEz&y;W%#XOf"8wJ~rp/|J,/z?|_Wzml "/!Z. L(tOMɧ;=+eGiUDHk?W^x jQDjS&_2R~CZajM0(`*TSzP/Z>~j^no Q j @+h3V1 RxZkf,nI?qjO[H'*\M`}k$BjʉT(Ҍ1^Pr(J^yfbTUU@pGNaR3*0N@+Dt ԝX0,bEu[8YHm0Hd]| p"0lw Wi(*v.Rs;50QC{T *80H1rzCu^ ACk*޻P' r~M~$ 2~a׊10"ǣ[*z;-˚4G[3 dxDG5H fr_%rĝs倀u]j@|\-$$L!tn{Zܘ(HMJ$eӜ2=UO$ӂ"SJȡDN p> m#^y!yT2hW/8 8Wc􆚁".An[3}ǯAה3/-] Y#Azov>`wŌOgn8 7&uwnB.DCl!}NȡwΦՀ- }F6\թ6ڻwa!n!6űfC{7u@>w.י:<ӻŰ 7 RhGwM0B11gn -E}[ plS(S_!BD^qAWzvxlO-VZGg;p*y{3,Yw=\w9yx(w[A:1B^'l/*6ջX ca-wh 5,;# Ąm RşSTu8ugebOApT}\޲>Kۻ*'4O- ' ˶>( 25RBӜA18FZksu͗ln<}7>wxyMK$Gzy;xyAONv%QK}QهU<++^2d["hIA`rUnQU$ %ppH?p:O$Q DJ\Y]Lk dUTLb%+Ad@ 48Ig^A"FT $kITE@ƃP,8ǕLVRgW:-8A/N_?5Cns$6is|Z g?<61Ao7_/_<3>j?+)/`7=[,W|?M?;wrX;~zV&,jߞܿ BGB1ABL~Voư,eG @51_/ƨfc*7_eDS`yb"ELhr4$aMV/&?12cꘟnU\]#J8bR,R(ʐР\@2DQJsV}-JqfRBgO]d̬ tC. sR Sd Tf@*m"W<( 's:H`4f9_gߦߩT+JZObJDF4SHnm5?K&+͵/c771"0%Zo|*BLyIq,nsu,g#ލYi F 3 S˒o0z)3\MHXeYW<UMnm g"z'盞"$(>[tb3wb 2ɯ)܉+i!H'װ_ $H>%)JS2YB5:N8VW[z_cx[gR $Yי mhBO@$q*E.-0@9!ye `AB5Wty!1UThRi%ܪ4"gYFEA@ :9I )/@κi$eoUnj\;Yrj-L_?[owhԷ\Uژo֏v~;X7w(i.Ea\SU̐)2E2V)nC [佥{^4ϛ>dCE$Ao jgV? (9/ےM3X̊a ׭|>kq)@Bq]qUQkB0W 4 1(9 RK\h&;VȬ\ؾMI|X0k/4i*5$DX)yĐo@T"n=bX02n>ƛ |gUFF pRC-JxH]#J@^^3^@qJ񖡧92:ޫ:O1Ev7[ fN3`&{$(cyaP W{cFvIwst Lϊ::+Oc=LpHKB∖G@+( QA P8V8މP읱VP#*zC;?3O-2^ء{yrTG:(/.+54_f>ib>_q.f(4Ys<-2>1CNJy{\/rLj-E4!=!>N=F9}3ݾ*[|97~x2Sqbڹo}]w7~ yu{|R[fIꨩ6_ݻ6qʘ5feTZa..] 2*TKۀv {֖Yrʃg!¥dӊD`L]%Uzc *= T!oPTQ e8ܓC(#T~,4> ة~㑼 ' U 4d QBZAǵY:W7qv4~|%ؿA%D/']pNwݼLɀrVZǧnVX 9D|e]9c =!ݙAZ#kk 5WP#IJ؊lY<k"6Kw}X}o᤬xx+Cr[7o!z[_VڛGf\g9atꤘbdK;Or\[]1R|=/N׌-'3nw\F 7rǺwpot$nL-\v)%8x]Qv{S;6?ƪ,r5Χ*tZJ34iN]Mܮ/͢-{왱]tSY^-}0Mnj̆IyZ{Vԧgю{cSv ^pĒQ WGYRS^#H֧]JEcb(iK]BZhi^Z 9(}'.q1\*e2{u. c3ņS7. 
J `cSPY3+ޅsyÙc\~] nԛp7 H/1[9c"²X'8ώ[9dJ5R X羣$Vٍ3nKj/ /ď6sQsH|dc-)JeDa$łs\a. IYA0ݺ-[vD dLZc5/~gLrf K!܁\ti;KHQԔpS;u1,M4Ħd([(>6mc'Sg++w ~Q_B.DlَQwsOnN3xNI[xzwa!nmJ2Dp-:\W1׺(TxjaLi I v/|nH{i.khǽ^c私hՁ$uItXF 8ٻ߸q$4e;6ߏ9`xE^a/wW`yo }+*ދɿ 쫶}:&d0e ;c] ?t8wzޮ4_ mhZFE1h nsmb.9z`ВmGZݶ}e@z(UxSLu_Uh 4ZcjQ~ OF&1THdQ,>6V8Wut<?_ &,~T 95UO'!ŇLďhXЁc$숉'j9&lWMLF4gb}DE0v !UJ{+hy- a%bcDKϏ|l5>ƚq؋AJ* ;8v7m+‡徹vG.F̺ȗ ,^5IzܻS9_3xNnyr%Q0 Aȴ&==7^E)ydc,LCOQ٠|fպl]>*:;_O2_0)$cK,/eX~0nI*_SqF NsK-<6er +n W I~< _?^.|xxexѫ:Cl+?Wn;H|b;o.<>F;ɔ!4S)4.%&X++p9,ئJ&F!uI=>!DNQdO1e)RbҔ*s8sHfW6V ?)8ؤXBM}bI^] ir1ֺtRvd[re=|U ݷM< T *f}Ry73W|i ADDH8B9sӛ3سb5(g~&Bg|ӺU3k>,Sh GO4VT`0%D bl=/Xa_U6$h,B*T !zbHIIEfDs-2hFq Ǥ$@BK̩4)p4E H@k"`9aQ;N 3 ’as#S0VwrD^ 8P%ύJRQd4(gֵHeLQ-KTjuRhZQh^JZ՚]=Y+q A[ w-a ~ٻqKfZ [}15FZ XD/EIxs s:Kۯs9d:^؀?dNδÚy2GWK\mZFᭃUnam/9<. apC>pkT*dgn߭.?AW1\5桋r`c3M_w"n U\b~4lWx5x6UdGﯗ_&6{P9h's9*8iP+֭4YnO 3`g.o`ٯ1hAZܛ~}"}SB0VX!BO쐏Y#m"D S=7 $t%%ʌ0c0 ]PGvhO!˓p@/zi{ (DHThP&r敢w[{b{ӖVX[DDR6`? #Qt𞚃7+.vG,AHժ>V;<岷jjmH3(* Ղn39AQGqW<-9uݢڐg.eJ\ٍO0a8:wK$գVD5 p1fLc Sp,*"F-˱TkSc(PC֙rTbIs+\2.}6y+ٰh?G }Pqu %b5/ aE4&M^^Sp]zW >j=\$BsBeGdr;ObG:8 >&f}x^9sTB(B>ˈ#M5h+MlLF2xc^/Gc;>/#!2D8I46"aI$XBi\zçcJi$@=I3otVaKmj4%PI 6)P:o(j5nHRp+_㄀p2 uj-IMK"Q\<)]~`peL[3rC2\L\.*p-b(SY =)-l,c) qys ͅj g+n' BMԊGR3B"b. ֆ!9?"d=LǤŬ wVUEM95`vb[-`5bv{X[?^}΢n^=3Y.wKn1j cWJJv'.tRɴ J+ 3_me $^W=`AW9nZvJ2T {,~(?O=4Y77fj?/L."B9p{'//%;E:lΗy:};[2ܽl갂 + _`(E} H'i0MlCܨ5//0ZdwGevX.d2C'7DA/0*//ۨcӬ֜~!bTq򎒟gX?:+ewŷNfa^9d@ɡ^]@R?Dliº$s` +2挳1uutP@Vz_FÌhP*ѢקnHn1E*LJUUQU)mIϮ=NUjBI9DG ZBp-b9Tqq"cyn y&wi(ō<ω)1)h1,m4#&$N]"#49lQP<!uNj8rnb{ N`DJWkoc֘Q2L#%l2"qU^LAُovHYt-/o|Y]B3\o6~fܺfSq(y5wynW ~~k+@HH.+d)VnYJЌ+ @Q&qG}Ռ]Vms{0DmDBQ8 v<2bAs+F+nT=zh#"]po->UEvkCB6)N'|J2r-Ὸ$v{ X]\Ʊ.q)WFq2U-J!,;-B>qoxu ) qA82p~(#?y6tu*h'L{J_)hDԁYx`/;~P8\/k~ȵ L֫' iWe<=A`@(\JX?8^Y9y;]D[ ojm4}}C4ƚ!k)-/DnDK?)a3˄ڡ4#H,I13on\yqe+GsbL-(UHvV^ȓ%5|7O=l^z@!"- -jf͍ +.\"AaYn9/ ZVޠUX4A ȮafZɴsKq134jFC:DjtLpqjrb9g儒\k,w0)SدkT.rh T.#vD'wTNTHcn%!ڕ%:H {Wt($TY,*APr^? 
}ƆPhnG9Wr61v9ϔӜk;2:hLN9ʸ*)BMnttsm "I,F?9;/eQ0B8)zU-?sm拟J;5<) ǂHLG71V?{ܸ׼UɃ_S{WMej7/Ipr&HMI 2h4_3~sE/O -7Ra/pGgRk{~H]7HZf,UOM3B47 Ze`1gr&:ܦ]я7Ѕ&9& HN8} f j|Ȉ2 `S94W,x%@Ze%# bƚ kbkkYhd iba8BJmPiQL"$˷*S=9sփx/r᥍gB㊊־T%>gG߆~S.rKQ^fڲļwPʮ;,˜C ϟvE7( +o/]k_PJ=_.]rU"|.EaYvea??[g]u}pi];w %xUidWJqPE)VB (Q+&4J-buП.\yC]>f?d띁&odvsUbR)9݉ "4%R*∪&aTuP䌲 2|n~A#8 鶻gtFOip޿w(cΞèݦzPPz%H'LR9CL]3t|K]/*yyGKzBqՓ+] F%]d5 "Wj{G#[1~MΌ -e!K FblTXɊVk +@B  T#8F^n@}Σ!\Ee!a ^ʘUԤ IP#duBU1FBSI"jZ$@"Km$U (nNU$B\T8H|J\$-AnF[l#.A6) %X@Dٟ1\9n.$丐B夼,pZ.\-2LB׸]p!">DO A&`1c&)&*NOI GCj!{IQxtIwjFusaYE>#㌑SaNaBNDlB+}kѻbc:n3fONt-zm 9qlS+ )- eEmy-RXv,՞D~#7W`x&0n)ړypve?JXʈ\bޡF 7P"_qO&r\&b8a))I!KN{yp+c$nrqrb*+6i}_Ρ9>wj2xW6a3m fhQT,.mYp(YއotףzI0іwdZ dkzȕ ^k4&W.Ww9BPAfS9y:.f1h(U,O59>Y!nW*7*0@fsl<'"81:`3Ȃqmy6~_Ul] @i.<_c0gc|M1;q9Db"-$϶=嗈le SD@-4ZVJduʘX2jjSZi@ee p]5+@f $U- ;YI@ u|.{%(ovS b?m 7؟I}g<?4=c&WL frM YFiR+\bMk(TjSB@joEѤ>|C!\`!&9!<g.8 Am@#@0Vb j+vVc"UjuVBa%.iQj!Ia%2*05XpZZVv:%dEQ15MV#*-D`DwHc^{d޻+x `v*DpIDRcM!wPk˛o8*kV}몘oSIL)r-@7?988ȹ R>~2f@TexǓֳ:Kʙj$yx-B}~)U\M%߼y,GOՁPcJ57 S`f'*EZcSHlX~*ny|Yߞu}O'6ˋ}i|-Fv}iEudf] D~}>EΥq+ΰ[^7/, ^YT/ߕX䍻'.( $u(hq7szYmcƚWԑ SB[2LPlz_ Q#:K{ WksVl#ΑAjGveu&g=ePvQ(MJs}sg8Fx1GZ &ii`9&Rd=_ml7>qXw~ն .g 4%z9sx}xY5y~twzgnJ}Xs>E4,V9hg`y߮n5OWᅪt3rΡ+um-O&sf2anG /&UAg}W1]`~s~H9gl9N행-劾E>?kic,Ho:ca"u(6/KXmzy]չRGVT6|8/>Jy~!m5G<(RNySmOV)<)b$3ٚhTfՓk]}Su[H/`xD<<$ҢxVR\OIE RB"p_,ײt87~3IMOmqcv#lJ6|`K:uּfr5>Τ=SӪq_}s1*;m';)̳: bHcav;造RJk~{u?ýlgfCiN2]Bqb"/Ȅt>;cP)``VGY#P$* f飅kcІb3C'cR>g͠a:Ą̇̄ d!?D[t`.UK\uڔDhA5yuSM-& D@CA"dϛ^yg~k)m-h'tPP1OR(0(5MkA#*Sڅ$0פ@P3ΠJդPǤ5ILg t^7ܷSCfhzeAZ M.ז+!b9$R2s.! 
ve\Q>'v0f@ƞ2{(KhP9܋B)427c.K5S )w c!ظqӖ)7!hƍ/7n}qc@} `|&:6'}C Nfn#46 DWXbsJq[l(OFԙ+$%|9{ )Q,@%-0I RX(VT5zxƑG {[ q(Cb9.W>ULٮ-4" +8)"WOjKA) !{/}8Pv$z{C?E.8eANtwt /Nm<25F kqɉ\V *#<iɜH'O uO'DG|b[sN \p0ߞCY̘AOfM(cα!Y~cFSYrPfT'6L͖І!!]l҉$DIs2AE`;EN9[RU"·24SJq7'X*s뀂SbJQ+BE[ꟚRKTضQʊof-}o pE/O2_mX){I|+gz9_!2)*CmpxB.ErHJށF)Cb$G wK,VF}_ddDVD}#QApOt&:o"p10z" T`cw3a1"΢` is^Pu}̀8WUӽ]R%üB>Ÿ]( ͡=xyf];Fڴ jF3ۥqz,R7`d6ž*1Kx:}fUÈYmգ]j5Uz2wZݯL×Iܡs" -4 zDû->/p & 0!*?y/5>bbH()e 3F*it=,ek'v-~<7dg&r:.5,!B™v7e?\tp~{>N 2*[[IRJ$ɑ_6{`*h28^(i<:SX@[cvHBvGҖ0I[G8{Hz tӳ#H+Tc=Q\wHqۦmϞeÐ|~tm8DSry2ns~("$m+{2)F^veTA6g%^̈m] K$w:}Yho|}Y+௘j:3.$4XaIц_bLht%68 ӭT ^bmxïyۇ_blwJ<:ü29 2i""KtsLE0R FhG6ңT$b5 H"=-v҈fm i6ܦp[ 4b@d K-iId 8| "Jtq݁j)%)&)kJS`#,=q)7왇F)j@4J{nqRh0N""儸_8YWܡ >Ma}3l/ :߅%v):vuhC;B `J)?2{C= ()+&&>ܠZ4ZVR;6HuNm̽u=ʡGP:S~)PH6y .`kP?5VQiɹ1!R/ h栯XQ=V8#ņ?{7tT/<|{8dX,3y{۔=v_n3ή[,wͦeXHu.)pCsF9?}|yp<0}\7s?DDHV 8O?~Z1/VhM?uz]32W5w&+~x$HRX9{v/`/@¢u+}gTT H (!Ϗ%}HĚ;u(Ҏ;pxv7BVR.D10^۷>K?FЃa{,U\r)h1c6e: O ؃vE w^ J&WWK P j85X8jnC#2N*=8J_>`t0Є*r8gR9ƄƌMy e B+7A6d?\tEDl " YĔE3X\ư`05GG Xط­U+s{~L"C3):*OLDgExEwGwKѴc'KrE*qVԈ.O ?e߿I˃-[fω ::ʦl|1F(;jTSԒ\kQK}~Ċj6j^Bl_ubm gUT 8+ LzH*i4~ 1FS-{;a M'Y'M- gK ޝոoHDBHZ;Yzs>Oddhĥj(kp mQ (I+Q-vԑJNL^;lummc,fɛ#o~dG!D [okcׄ8Z)X,M=dCh^.|ow_k>h;m L{e3z.zݭCEJ(~["֥֪?n1ŦN-a~ I/a bgSWYM_~g7yy `e7ڿ^o>MÒK]I M%is[/x3iC2or/R{s'¢@#QCUN9ݒu&1)>uAթ;Fv0 ̺~в֭ y*S%۔=@0&Ӏ`!嫾h?|%D#+/&SZэf˰d/m"Y`x&]g燳lj4WғjVc 5Q&i{mV@Z"]hxBcy#  f 8!L5Q y_l Q:9AіhD(CF"Dg%Ӓ#m1"ģD`5BHmQ:ђ-~)HP;=,K9'舶jcp ܗÂgx 889zB.:ƨ"wyS5u&T'΅vod{;'>fb+70 9ɶ,z;HavS@V:T2ꡩIJ;O zRkԎk|Jb2APw͝^S2vw݈V$:P> |jp@u%}OSI){zWx"$Қ#jWՁ\aur5=EuKG>1$-X#O _OTUH:ANagןIL$=1+]Iz\.Zel%R*I+0J񫒶uJpIĸ쮶rm\n\r,)tϿ*6Tu8ZLBy*#J$iNyU.< e T$V.tq麔VyuL[5'l;/h+~w [C6Ȗ9ȱ,o5ED<(eH8`QZ58# }j"pr(IGJ"&"W,C;B@%IC`GLhE|u pS8\V2OBh?ͳ.O \y^Pb]݃|Wkl 00Fw/o7k>'Knڕ3}FeVS&a*jza1}X w 9@/VR`FT$f[iy@l S7>SW>/I. 
%>doO[EKT'Dƣ)#g`YT"0o$mJaX3.YXJ; Sxv0Y1:WFo?H0dx.ɗ{K"ՆjsӹϲC$byaF pg7fVNZ m,e &,H}?}YdoQBEqT<8(q]Ƚ|>_0DNsM!^lhH| ٗ"-l?isPI 'D||A~GGOc#pECłe2}Huxă`M>&6 ^mG8To#\!7 3?р9&u֯n^̌C; |~MgGM TC 9R ".*C,C1-< `3b{zw,.oZ}k*9bXp0V[v_ e-m*dKT]X{e<+;˒=YQ x1GR<"Y.nS{R2{SSY-C5!p}ߠIqMWHeרO<)=`>cf|"0I??imۤɷIo '()X" BzFdADi?b|HR9R*F/y>\?{Rxu3ֺ'EӪ.7ڃޚRﵓKغ2=`bKpr4ЂBjkI&e lK&bL^KcNA"t+ߏP,ϩ N2wZ`v7  I2V*hMkJᴯ6r*S2Ӕざ%wxm2<:4paIq FQNZ,2VZ 2Pk7<\Pkh,s.3`2-I;i &{᪨"[ wkhJe<@)CD9R- ޥgj㷮$jd@o 'uЗ{֌$ UֵmWÍNZHIj-`0}ԇ{I -7%׍MIJKU9ۦVx (V[ yҫY^ݧU߁X$x>`% $:Eii^wj()0F6\HVM n Vu],Y51C3RiQgm$jcqyB VH6V*RJd-NDa74w-FgZ8җTpzޭ:[r%qYgCb$X;fx$T)9;=~"1F *q.dh&:7`<\e>eYI${u6 @XԐ; UP>KB rU(\r&9:W)wb_?6n(ҪxqBT`RV@\2hb珹\oc7]DYI> b\(JL(ȼW޻vGJy+J:Fw㫛w)E;n%Yg !ZYl:0gbZ#(8UYyF4*߳Nh``;/Q @Z ;9K:k| H90d@N3ɥ ZvԖ[I+QLpzeA+q2FXkJ(qhy>6/{Y >Iw+j4@T h|D-X (H$+2‚"DOAGj.QM-X #R::[XpXu -E"B1AX詍Oo̅60pQz+xaYӱt,&*~_XI,*LWl2V?~F&6ppd.,,2pŢiRs?B#PAN'L(|X»(!^-{B!2΅v_u>6柟ԎXҔf׷gpR:C;W~e=^C"al$1[V9dۓ*'PbcwnO9Z ŖWqO!G[aЯu9 5dh.̗]DNH${ВA;A&7)2srB"-ſeM֣V`DZ@e&SvR<3dm%VZ<Nkp^p;gBC޲\6~۠NWcNP~n`{p77 D)\hU!Y`J 9GqtK , u&z3 ^y}!i;t@b(Sgᩤm !3]*s.ouAf).oxA /lY饛wL҃w=!Sws;o5M.z*юTiRQ'܊*a}^^ڈŬ5ͥ h3H!U5,{~{&??ܔgt:?g>4ӏ>jNcҐSG^ BDn\4qo|::W,ǯ~nlI%CIcpq|"Խ M[Y]7}9Eq0~M@I{{wYi!*}?|oUk1KltX_Bth_Bֹk&I+lJ&- /F|>}wy:3VGs_)|O,-1y2UP)wUFhuPwzdtDh;A5lFfiIj c4J*ܶ`!qMWoB2u,>m :Qb1^C#\2k{I6N0BbGloK~&_f7ezLcq㮱&r[(-ʠ)b4uߒjrXS7-=Ԛ45`ڈB貵sNXI9ASPQ&!TB *T۞u1_4T<؁~ǴJ*R;3a{D֊l;,U&Q"ϳL%QL4W{ާNֳ}/4rPbǢ,DpZP|rA '(4Wsmdj~25eLBSp7DKfuPFLtwr8S7\6/Z1NL}68]OR! 
&yXO+ Do9'p rMܞzx3Ÿ/WxΕ]>J3V)[Y.h+WG2횮&y<ܕE#(\y|b=V3L#0m:CIo~^o(NGϮx8+OvzYqhf3J&DS<46gaUgɓ2 e$UA& #q5Ӣp1ST: ShN4Z1n"}` .;I@~ 55[L=͐=Չ@lG`J*dFG„P*I PPp}i7|7o̓2MZ>Uh%rFJFUI f84;j=`h{m ;9fu@)23($#CF4?gSk5~r|(bT-$͍/!IJU:77>1 DJ}<:Y+7~)+Kt5VGզhd^\bF,["9peg2gZ¹mwQ\(3qhh ԚȆ ۰`.įLjn9(G46mR FH`ԸS&趩֙T qɝ,?)brVᘸ_Wbߍ3!f'vVuH}=?`" c rM)aVOV#lx,ҐUwoV/Wqmv:+guJxɿѝݠ7rgUdykJSB$)+n$&F$֖>]"s>45X0蔃p} b{bHgkha)8:)N2mQ*|ZNM!QM `qШU{<<϶Nr}ep=SoD$q$VbkFrdmHB:caR_îv"мWP$' PD%Ht .d3: LP&jUZp'-C XNth!SB]ɔJJc=׌FXP"P&ӜkQ;c+tI\/iRfj?Mxwu1܈qIn in`0\9 GP>DFDhmjLޫ۶pAj+eav"kUT.t}_l5YDQ'8#$JsQFv%]# %P) >D뙜75<[x&p*p W "b|ɰh.dvX yV-*I:'V ^bԓ3r_ؐ]?lg68٩n8&z4d/$ᣥOkL.6h^>!Л;QVkw"dΜH8xǤqQ:#ݕؠ 8@hN{~'vPuvCz_$%{ wu:rnG8r< Pi%4J>2kirάO:7X D3y6BH!0C"!pΔ'eH lK 3[@~:FD FI#󘍆P!@$M97SJV!,Ge1VFe+` D7PȬ ʅZI[Ұ 嫣y~Wgn\v,~k.}SoTooo?>yZ/Ͽcx&'M9_Mt;[O9yJ3D]_vjO{r D$)hOHigKYwummo;l02G$~ߖ5ŻQlI|YA`l^|V(tN}4F RB lZu2.z)/c1HNwcQcP\#D ƇN+ފBMi^~6E?}E?D'p! Q{(tء6x</k }2 u =LxhڼVizR1CPO},kFSšm1N5# %}ȃ*dRɤɣAV<=\lE#! OsoC*'1R[&ˆz8Exs+׽3ؘ_UQK4a |$6jlE)ŏE'Mg&]j Z٧Rj ЧRt}UZߜy\wQF#KS.q&޹(,  $I_ZDU"=ET-kd"e}Rr6 wv"0FR-ͨUTI&QbdbdSF!F\bT1eCj.Mj"\$Cb"6 ȧ %̡ZwV"М>ֻA aʸPe6-HMpJo^긦<^瞉$E%5bD5*M[TS)mctK-楌s%r9WA)ǭ`* 0ύ-Oۓcƥ H$pNȍ -c!ޭh!D9&uᦢ(lb:xܥ#(tnj%# A^V+ދS{;8mHݤܭnR=>m.4FkYwٹ]c1 %`Q1(6]NPT6.ggl圮e+Ւ8][D#rDR6 A;!w7vĶGWضIFURĎ  }>fP8qtu P]-E EE'-e vWdHǴj+z3Ñ Whwa^[>Xm_^^ҽT]AUgCH mۍ^Furʠȳmf> Jh\>P!lUJ8zݺ B(uAԾu;ԇZ>4䅫hN vuc(ʠDujƺp׃|Bևp])QSb XRR}%-l%{y9g/L** %.M@0>1 Bj=ڋr$SIəJ6ɃL2 0 /IDջN[OHvpq5yxfō8y֋Isj*QJ$JC'ZBZBDhuBB,'Ѓ_9t.$.q,؜CM7O`FܫS9໒weߵ>ɇy)9fS:^wP,Ec9z0_\e>U0 >ח=HK˨-2]Qu{{/n! 
G.%JR)i>>>1:l{橕X#i}}]ս- m-U=W6Le< p?jK)B))RKY_f~HpiadI+{)|_x(ֈVc!lD@HkPU 6aC-o#ju\'NeLC̱> YՅ?RhZh{K!;coݸP`EI!:kauEPy&&g5c6'ށQ$jӱl4#rcnZ|MjgRnrF MJ OI2an,jt]%YY?cRWpc"w,x=,BtB3T@oEuNsrx44!R'Oʠe?x~m焮R׾2ef2u>^wWLƛU\*?g)޾*+b/z}T {}s+|=vnYn2J1mL\=/1%yj8B”Xl%8o JąHtHHLi)9I< /7Zk$ F4-{T)mTI)^GFT!t- "ݛ'Ό^e Y SLKqXVi17l=:lLSТ ҃ŻQᆈ5 siaK)aCr+!shNXn`K4JJbH 9Z67W%j:-2*n"xH;\Vb&(f\Nz>Bar^˱il7SK֚:藗yµL5jZZZ9V\IB:c*RjsJj9bi'eTo/5 PDMñjI2KM@2e <%A䃥ApCv[VZJlP|š-j% tܳOsiOO}V ˷)P.),_ 7mz=IcJOkxf( #z*^ldKRQx wmqHzVyxvf9Ֆ-Ɋ$+߷֥ؒ[<}xZ-c FG캝b dJ=RXA)c{(r{!/EoK nƛn_FIr;ꌦNɏӮ"e>iwq\qDJKR23Ee`U8@;0>)ƺ"+|MɣhœZ~ vt= &;zvZMɟg Xx\ 0GVnnV A \q'cɬ4$3%ݞ3Wr™MPȞ*Kk[8Z.~Yŗw_Bͯ;f~8zش2ןN`!Gy >egדu8',׼26|ۉYq8˝u=c;ۭ;-Q}Er/nL5uc?Wp\'?H`! J (V&=OHNÈO|_\ͫEOEuIA]T/qMR.go@JT'Vp_ bR0 4i~)Gp@c>p[uB*9+PLzݖeI? K?ۘdzt2bB =[Oc%K_Pc2>hVI!`YGAR"h-$$H='$q?/.E-էqU؊+#qoqG #MmUr9rX־FXJgtIf37}bQPQh4Ix=BLyhDl$soy'>wфZ[H='-pr\r#6 ϓz(қ"HTY$S[(PD@̮ T2% ^Ɗ`u.exPIϵ\OM@oGNkC:pS[I1YtBzR|$eV5V-,\:SwgY㌛lǙX['46"{)b $(r8QSu-\n!2[ȨAFMQ߹9jd؟1@A 7Oܬ& 񵬰Ԯϱ7:!動ZQXEcu'&PqŚik`IEDa|H8Yzu k8<;Ξǐl@Ȏsk=Jj=KI 38yPwA<3Qe5&sd3:U "f\\~>P?cH4̭DADƢ5q1b~B 9uc/8iqCN(>^>B"a BS) #]^l6Q!Rtfqk6JI[C. {kϹKv6K&iXazcߞCɻh3!_K9~@`C0Otf>Z,XG {М"RT(BdkE̷Ѓy}|*>avMdF .4D J!-+c`!y˫/gwrg>(t Pyhw YR|K#otě=E7iͧeuӸWTq2Q-&CZNnx_^KgG'c!8$ǻ\VnO5 -A8&ɐlƤ`甼:J;g!(%%㼏lOl)Ur4? TURWK '`C;=jȕQTlSq5(!hXRV&r#B2SYrpΡB~s1V|uNaobJP%QFIxDv1Ȱ~kO0:k'!TKA6lhՖk/r!!tIZ,r3,iJR; Qʄ_Pĭ&N_a BdI@^D=$qIs [Qb{d߹-JvL^@hmuI(̃7ެ ߏёL6r(=ei_8VFɡm.JAxcӹCg-m&P:gQ%C>j"7;YrtXe AFGel`<, [ʺJC'#J0h}Y,bl6ɹ8AI9ȡ(mmɾGYdyMY7AeDt:tZP@b;vor@6ud'Pm-e-$)vD$gc LSVNlBB0s`z@65.'KI%MTHKpZajVJ_iR0 }z-io˒gaIf }X,Bk,A-˭Fv0v`ɮ\biYaMhAT] `@80+94I?I!(Zp -m8Ə"Wzn,=g7º5ѿ=Cv}om_mK nWn>!Sj̼S^ sn"GkWnv{b皻 }~P]aZI 0*%tݻw&(~㾳JV*9kd&1nx4vAڟ4`TtxL䀓;_oQ /WEIǻsCqSfJe|z3aVQɖ?KfG^d`3 #P=@'eRfuM0ϜZ47΄'5)M81yCxցlƭ}\,0l,. 
8wsSa{(A:(նƗ¦h؛mK{)WG{yedۀw m7OvMߐѥ xFrWCFݩgH=˕ڵwy۵GiJ'qms1=%`ڀ{D99 xHF؅77}f6JR x2wSn9nYS@{ E\@8Nvb;[:~P# Jyll4> 3[׫g鱗n}s1~ӽfTE῍m)n0DcgbHę> W>t;:C%q ʮd6&m[^*u@]iczkOx\ ~Mm(ɻq;T]͙#6oS,()j!g߇Lt˺ebx($bS `Лzɀ^|ѐ!RN-s`\.FE&C Q_"?sdb9cu+ˌAĒO18}.6y_:zTCdP-<%CY0Gj!?'-Ke#\Uh=}Yϗ0EJa }@j$=Ln 䜁({K9YXg#dfJ8;Jf5 XY2{kςX`+8rѓg8un}Cn]z{6z nІi;M=۫eq8=inŐוkinXơc;ܜ-8m84$=ޒVq8/צqhnY{\r fir KLZ ٙ@+,@l{᥏P&/3_b_M;RxNo^?V7H'1yCcY~אT '+7Bx&W &-/ OClˣEv$r`@rӕox~3i<ߎ?ԢIO1QR))Лqx?^OnwҝWsW_R}Vo_/Kߥ_~; .a}bݲy5Ўh{ l^UzY!) iNX#lpځ6zj׹:îl$k*jh`g͸Q!@khZq*֊WsH vP|-dKAx@ *] <?FBDp/{>8txMFL4tyKըf/ T{@57|ヒC4.}%Y5t"_rX6H 즗l&u#ueOdgyLᛤaX&bZ!Lջ#UΑZX6̟~z;KfopKo|I3fik$UcZo&oHrZPJ]0' Â/Xg,5?D*2S,3<&*U!qf0)/!zۉCa}ˠ9׌ҞG=c v(el Oefj>|K(3M(ORo%:$](E}ŻqkFXBygW3٫d eRw#pYw>|(/',~}eޅ\R3/ U,Fʐ{T^*وʺerr݊ǥ~}(uk9w+-Ql` GcR>ZwgX3Wj| HE~2MX$>z*2-USt  H'Ƃl9c)6$0,]3‘)*h;vfz)Yo 06`0ˡLU!eEo]HC9@JgHMr#FѕcY!8B=@v$!3A5zRŒNF4l!P Z|XmDC=DNr(2p^<2Cj)S3'JRk=$=Tf bP֠CDQ}Otщ,NƖ;db-S-%T-r=w`ʟ.7rbz/Q / $c²s R.#&dr|`Ou)9ךMY4G$%΋!gɂ%y8,I}1_J]V}Ϻcّ(:s;@V}!G,{^ * ;n­I&ՄQoTXEAss:=p6odVZd{~ռ7Oߒ֕U~ވgG=O-~mK~6ImO+~ϗ~yYN^66[@hqӳLGy̧K6VܳD3fݠ_[ IW9y#?s'z4Z-jTVCT_ިԽYlTU9jgk7G9E|Fz&]v?|(n_O{jys_ kaBq)mcŴGsca8v:_v_~eq&]U^t̹mrE7u~OmXj7p]Cw!t"8$Iz75yҡN`̒>3 /oÇPJrؑxisĺY2k>0צ &0Ꙉt?V@GƮ{l&uHWd] Uj)JD{K8f#H&剜.|gNiS;4f ]@IK<ѯaD'" ߙ/Կג3Z'ݍКLT˚/X"04}TJjza W 2 Ht0ArJ&2aF>v})Ҙ>!K42Kw3=]Ұڝ(5ƌ?ћ5%қd#mے>xp~n]:u+Z+K—u7X'3/ 4TBg'mM;o]y;]Ѿ Mla4UF6i l "7ij}dY@t"B)aDCߥsY ;+S ff'!j-XTX[nco9x?WWdӨwq /C.~ro[o0~3 w">h .8bxňPI{! 
̑^?>L6ع\Ї?"Ѕ[=,} -VoNV׮u>|[>G#ecUafД1E$-fK[O)|3 T,,{lC6˗mfZ0`>$8yIUz)@JY4п)eiU+SԹ3QI*$+Sk1SfRDʭHBfd7 RZLC֩'U6QbKEe{ d MV@X{pF ,L<ˈS# Zf*-nʒ:DrCc-ޯP)?CGv#p]z;oj{t_8q [|A|]1u.onOQ~_ɈKo&Z?:_O{2|+oegsv\2͸A,|{*̱ŸQZǴ7Uzktw{z}rCoU#O=L_}*nθ{ANFC23r_~6D5톤daٮ \*k乗4mPb51=Nq?68\JXq4s<fU 蹴,5ddk\:iBs&$NLel.@qEi)w3j<, ߮6GJJIT0 HB*XY(WR8KD--1eX}pφP/\6'q \q+܍M\3Q30J0S$.OKh x.֚^Z&b́I}zF5T8/T-6,Tkij sŮNU威 sw>x?&'Z-K?m S&s$5VTd* DgPy;LUZQ:'I)Ԇ:SjqA+4k$5 2t!24T4N("MiRSde6:6sT˼h +s&8D<%J͐~qq{JِI}/5[o_tmaZ*7gfH-m bVV|v uDd6#ʢX mKpyƘI.CƽAr6672 T3,m]6 {3Mq/{'ۣm 6<=c7t8;L֍}DZm ,^G K" (<* EL$h "Rh"Ո>UWIcM^RBxD_yIjx4\JH}#҉Oy{]zLJJXxYT'܆nD7댩ө b5w$ 7_Ƙp:QH+ߒ^ԋ7f@5>]8HAN, =9) ;\c:qz4}{\Iz?}Af(Im܏&ղzwzg0kdJ>xrD=gل'jH9,ܦLnn /6|t`Wn=Gņ(v!aQp:QH*%gMH29*=iGEM֨H%7.N5N)Ř'&@*AZs*<TBO/jfGG J"l?HOK>ᯇpSiզd@'%Pڎy: Lէ)׏Oe̿cY_[W/Ꙑ{c?o*e J3h?Vh'2 5+m7ͫ3IVTޏeSyC%ɡ?xA.jrOYĮaR2rfsXU׿\1\{(݃n5HgٻI2ͼ0} u!w7myZcqw댯- ? teZ*Fxjo­/G|s{{Fje?4u!@wEndJ3@pjQ]Dr\\ЂӛK \=u0^{гoL%8>&JAsM%oOzOϱA} ԆnS kI}:jG&_z3UfӢ&RBctЦH5F'mP'J=S{Q-cMU` ȁk~STt;(tƙ~S4]S> +sRH8noϧS-9ڸI5qP:hVx=/W/v[-l/]qmw~4ϺE *5:'!| AQUIl@AP>ܭ.@>6Myr <|S+4Wi(imo|mu(V,Wg-WT #Y9gM:}[>-0^U$ o 竹{/I5#ZS9(K$6@+q}oŌ! &9z~/(fK:[~\''?>U'r~;-r2]ǜ \( EZ)Ff<7{@Oe.(nGӚv]>Їω~T[=+&n5Z9Y]G'Vf$tlSD T͑` J e[3 JJy*8-V]lC.BUJ@[_GX"ϕedJp@3&Uqa+;^8 5ټ(r6IJV6Gr)BVP*ڡAPo:'ýQzP2 Q6p([u)"~cfTEQPeC^ow} -ѰFJ/7̾|ƄP$7ʳά=RKj!UR aB,&:%>+ƎgR %CDA!x.>4@1|5 !s*ܚL0gJYUdJZ1CO˹PG1̡c$ƴAQPe5p,Q XK D6QhNU Щ N_/Ư9$SM 3Jp)Ix=RJPq[hfB! [f&뿿yk TGouqZHvϑ};c%vv!h a{}[|͸Ήg]=Us>ǫ]9&Z M-å<_-}6=0_Χ޿o(oz߰=]E*fjTFg+{@@*܏Dk4%b}T60SQ<-`ϟ>֧iħ[u4bJmW6e{g)k4;#j4 Ad!zQ@,V;} пV²zw*t(!+>DmgD3N{=zC$mo6|[w7jķmBUVRkruK~G|{ZK=jqaF );_ji-;?^}CLJTM M%ݟªQۣ?FOhRc(gښ_Qeh\7z_^N*}3zmGgJ?%[u1Hmk"ݍ]C⣦Wߔ-Ŭ[Nc!ṽf[?vkႠA;=f-hh[#^VuǪmݼ9G ?㟵iܪ"YGU4I޳nú FuRĺ/22uKO>n]h;W$:yߺ gn_IhWo<޼:ͣ7MSQTWQ_A)J?)eD.(J\s\eQ$<EB 9H깄I"%P$p@$ܫd$ιNI%?:΃mԝI `h NƍW0!nJQSI b`ߠ`XG|0?0YΑwvA1D+ KZL8~ eKZ=w>xGߗ%dcX|/(}_z!c^qKix>:0Qnp/Il2TBb(pM\Wui(`Rh L#BL\=aHN{1) /+Brb9XK%+t#\+Oo+TݤdVtPerXs&Z"P"$d#IQJ _.kF{>[XQvj4!ȭù|݈e0gc~ފ+ng| +S]k潌D CTlBףa2:BTl"QEc\s3ζcz}+. 
6W^\̂VD@zTƧ>3O3t}`^,#Cˉ3$u 6GmEX:Gt+^@G^9@dك@V\Rq65Zd 뎱zuG݆Ue hO[.|wmP:ɩ 5pTh 9SNv/VQtS8E:+%FÄuNH g;1O.;W1e"*W\ abܿxsUH‡ǐ|M~gߗ1>t;뫧Il)^]wFaЖ[}B tk ;ȝ}H }ȇ6ŧ)-K"4N&rc7Cvkvo}I+ګzLrߺr#)7wR׿c.H6NZo~0'RĻ#2E9Htޘ˕"C/Y&z`HzH@w6=peIfMD@PkTwrϮnn_Z)xwu{72ƾ@ڥ2t/2ҎьJ{{,UBX3D)v*w \h"HNQK 'd bx Kb ؖɇ@qB+Pv )$]pcH|FOeWJP2;?O`h}yR%h^ŒHPƸ$14hLCs⵸DB19t9Kk@ wX[Jg}H8aLn:)oSש!˼\yAO>y&5ȱ=g8Y~[Nf@-<~ >n^q#HbLlFf";Cߔ\%}5 GK#6qnjS5XէJgQ@Z A'WhxN Eq%X䃚&-\c?\ZZLqt tb/păLj)4PFzo:#7eQ8OWߚv6M{ ?^cVZOw}[)˿tk Sx;XZ`FQԑw;}ߙU@*eʹ8`k"c]{-zȃTm67eS!X\->D)>V~Mc]m,c2~:WR߱{,cj4TVdztk|-n~{|啅xLڳɭ*\ߔu!\EStJq6zg8gnL˯If$\4LzЫ7W_"^o^}N7MG.(2LCvA<ȏۖa~r ,)6ý ViOC8|:(M:B6:=)%p՟j0Dd3)^7Niv2F_= W<./wz_L{MnޑNnh)$p>eSO@05FHؽO~SqٕDJ+G+|3&+. ߓ“HyƤBo C9GM4ֳ%LIZ+4j%"eQy;8 R@:GB TH\ 9!qe4@ : un~Si;pLmvd;>6MkI]P1+qb#Y9v*68B5YHY#*&!%!f|8DԐ eH &$(Zi*b]X)q3;P*,DθS+TYLm{ƭ6 LYAtP8#͹& M6k 4߽R3b2ʑ2^qռ_a i74bXTZU_BU96XT:k2E"F$ۢ$HzNľXd\bǦEV3 JS :{6TB`F I~_ UDKo]T)5O!(i2l=^{uј7P "8!.JPG&EB|8qЋl,FE}A_Z+y(2Pk k*}u߸ثHP2DCD֋j#]P_@a/!I&\t&G΁gaI3vMWdɅ]Z/4-PBd-8PNs @hayA! 8)ƒi.욲bOq^(E.;MᡢUpRNnRmda<_-`#q %[0 V:d (6H$6T5>s+s(yQ;\>tӲȔd 1+'{  AA%Vo! J* 4ݏFhY^Hf]U$U8߆0TK,A ! HNv:W 3NoNFHE-iIR(Dv ݄I& a m('Ab1ˠ!y"څ&CǩW8 ){`(/pJ%aho$ɥW!KQCeDzBde;$ZoD7J`u8h0Dg>"Tc d<8O;ZM͓}q@sHG *Ì=as: Y P0*CFHJoO;^m~fAј:q#kABa,)^nq.|5x hf҂P>4~{_\D%l%'^OQ[ Ꚛ7fPCoyy، 1 s]kWxQKHvd{&3\dO#lM5A` o pڲ 89ݤF5yIO< p}OK wws~e {gw4JYI렰=̗UkW9_$J;"I)I(ܫpAwc(zu$v.ZKϼN8n&dW'cxO]L J^7rF*fT@\rau*1ox{@mn͢uoD\d%Q":TN@"@c+~Vbšʆ#>wFxo )? =K/LĢ1M:so~ݣcTqZnEaFEISH8{#"(2n*ɣQ5~\aoGָh9՚Jf )gM. =Dz@WIyܬ!q<$$H+ FDάS9\5pa rq~PTP?-B5gBW&,0xjͺH $^[@Bi(  0̈"}htX80$B\Dc|{ !T J(G*Jrc['5uCQ=!=\V=hUceE{YTpY[C5tÐk;ia{ G2d0I`q̃P$H1s9*e׆с7a )d00!8aJXȐR , X*HYB! 
14811ms (05:48:42.569)
Feb 18 05:48:42 crc kubenswrapper[4869]: Trace[830441728]: [14.811091202s] [14.811091202s] END
Feb 18 05:48:42 crc kubenswrapper[4869]: I0218 05:48:42.569820 4869 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 18 05:48:42 crc kubenswrapper[4869]: E0218 05:48:42.571462 4869 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Feb 18 05:48:42 crc kubenswrapper[4869]: I0218 05:48:42.573081 4869 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Feb 18 05:48:42 crc kubenswrapper[4869]: I0218 05:48:42.573159 4869 trace.go:236] Trace[343611954]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Feb-2026 05:48:27.599) (total time: 14973ms):
Feb 18 05:48:42 crc kubenswrapper[4869]: Trace[343611954]: ---"Objects listed" error: 14973ms (05:48:42.573)
Feb 18 05:48:42 crc kubenswrapper[4869]: Trace[343611954]: [14.973591755s] [14.973591755s] END
Feb 18 05:48:42 crc kubenswrapper[4869]: I0218 05:48:42.573186 4869 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 18 05:48:42 crc kubenswrapper[4869]: I0218 05:48:42.574501 4869 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 18 05:48:42 crc kubenswrapper[4869]: E0218 05:48:42.574659 4869 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Feb 18 05:48:42 crc kubenswrapper[4869]: I0218 05:48:42.575881 4869 trace.go:236] Trace[1423577906]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Feb-2026 05:48:30.753) (total time: 11822ms):
Feb 18 05:48:42 crc kubenswrapper[4869]: Trace[1423577906]: ---"Objects listed" error: 11822ms (05:48:42.575)
Feb 18 05:48:42 crc kubenswrapper[4869]: Trace[1423577906]: [11.822381851s] [11.822381851s] END
Feb 18 05:48:42 crc kubenswrapper[4869]: I0218 05:48:42.575903 4869 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 18 05:48:42 crc kubenswrapper[4869]: I0218 05:48:42.587877 4869 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.022181 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.023226 4869 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.023331 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.028367 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.401122 4869 apiserver.go:52] "Watching apiserver"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.408567 4869 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.408990 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-apiserver/kube-apiserver-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"]
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.409495 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.409608 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.409628 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 05:48:43 crc kubenswrapper[4869]: E0218 05:48:43.409680 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.409736 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.410021 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 18 05:48:43 crc kubenswrapper[4869]: E0218 05:48:43.410178 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.410221 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 05:48:43 crc kubenswrapper[4869]: E0218 05:48:43.410457 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.411491 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.412512 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.412944 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.413033 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.413059 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.413241 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.413278 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.413724 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.414721 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.420157 4869 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 13:43:16.592159326 +0000 UTC
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.445257 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.465156 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.483732 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.496124 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.507326 4869 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.510697 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e7e81a8-99d2-4752-8198-fbf3f6bfa860\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887726f89bfa0de3b913d9c306c514ff0169d4c6029c5a59fbf6d9f6ed9d22ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c88479e93208a5a7ce906e7547c7978ea44ed40294b49082c24c97c7c719dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://325ec31cd94a8e94900dbf516361ff49318d8d866df876657162a22ac4efefb2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98db281657093f71f44016c866f6e5bae319d20d6457df6f0f5adec6f4af40f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b90d1a9774d318ea853311692c2dde1863497547c151b7e7826531868e619b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db2fa5a8e1ca17324ec5e23c6dc900b1f40e4da54f3cbedf2ae80f6e89c47301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2fa5a8e1ca17324ec5e23c6dc900b1f40e4da54f3cbedf2ae80f6e89c47301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"
2026-02-18T05:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T05:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.525328 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.537617 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.547391 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.558295 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.569947 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.571476 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.573602 4869 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="98db281657093f71f44016c866f6e5bae319d20d6457df6f0f5adec6f4af40f2" exitCode=255 Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.573866 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"98db281657093f71f44016c866f6e5bae319d20d6457df6f0f5adec6f4af40f2"} Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.576882 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.576927 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.576960 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.576993 4869 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.577021 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.577052 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.577082 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.577108 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.577135 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.577162 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.577184 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.577204 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.577225 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.577252 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.577252 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.577245 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.577386 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.577416 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.577490 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 18 05:48:43 crc 
kubenswrapper[4869]: I0218 05:48:43.577522 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.577533 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.577571 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.577600 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.577653 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.577674 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.577914 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.577933 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.577952 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.577995 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578014 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578032 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578065 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578082 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578125 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578190 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578208 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578245 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578265 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578344 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578362 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578381 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578416 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578432 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578451 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578484 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578501 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578520 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578534 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578568 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578614 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578658 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578679 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578695 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578761 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578778 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578799 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578815 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578847 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578863 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578879 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578919 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578936 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578952 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578969 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579001 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579019 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579082 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579102 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579118 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579152 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579171 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579189 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579206 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579239 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579259 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579276 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579309 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579326 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579341 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579359 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579424 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579442 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579458 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579499 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579518 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579533 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579579 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579596 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579614 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579653 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579672 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579688 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579703 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579737 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579765 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579784 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579822 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579846 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579862 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579897 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579913 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579930 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579947 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579980 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.580000 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.580018 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.580051 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.580072 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.580089 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.580106 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.580138 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.580157 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.580172 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.580203 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.580222 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.580484 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.580502 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.580519 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.580881 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.581013 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.581034 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.581053 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.581150 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.581169 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.581187 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.581649 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.581718 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.581737 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.581910 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.581930 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.581949 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.582001 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.582028 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.582632 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.582671 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.582727 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.582765 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.582783 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.582801 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.582816 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Feb 18 05:48:43 crc 
kubenswrapper[4869]: I0218 05:48:43.582833 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.582848 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.582869 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.582887 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.582904 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.582923 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.582939 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.582957 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.583051 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.583081 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.583106 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.583127 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.583147 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.583168 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.583191 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.583228 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.583443 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" 
(UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.583477 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.583496 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.584052 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.584187 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.584219 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.584364 
4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.584402 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.584446 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.584474 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.584500 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.584550 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod 
\"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.584577 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.584627 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.584659 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585052 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585131 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585156 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585209 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585230 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585249 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585297 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585317 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585336 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585353 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585390 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585408 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585426 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585464 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585490 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585541 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585563 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585580 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585599 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 18 05:48:43 crc 
kubenswrapper[4869]: I0218 05:48:43.585636 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585657 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585674 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585708 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585725 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585770 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585791 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585810 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585878 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585926 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585946 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585966 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.586012 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.586068 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.586111 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.586133 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.586153 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.586193 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.586213 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.586271 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.586293 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.586312 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.586425 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.586438 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.586448 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.589777 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578147 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.590022 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.590162 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.590487 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.590552 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.590642 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.590802 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.590872 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.590894 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: E0218 05:48:43.590924 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:44.090900321 +0000 UTC m=+21.259988623 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.590933 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.590928 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.590962 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.591003 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.590087 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578493 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578557 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578609 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578710 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578822 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578995 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.591293 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579115 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579300 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579449 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579560 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579633 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579712 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579872 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579779 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579960 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.579975 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.580006 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.580136 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.580395 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.580596 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.580655 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.581624 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.581651 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.581697 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.583851 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.584224 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.584361 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.584466 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.584734 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.584773 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.584840 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585034 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585112 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585570 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585587 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585609 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585649 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585771 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.585839 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.586054 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.586116 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.586359 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.586534 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.586596 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.586916 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.586961 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.587902 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.588662 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.589010 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.589096 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.589100 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.589128 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.589355 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.589582 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.589610 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.578370 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.591448 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.591726 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.591735 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.591890 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.592049 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.592125 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: E0218 05:48:43.590089 4869 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.592536 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.592881 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.592957 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.593067 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.593131 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.593231 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.593303 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.593432 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.593471 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: E0218 05:48:43.593603 4869 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.593616 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: E0218 05:48:43.593660 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:44.093643624 +0000 UTC m=+21.262731856 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.593725 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.593947 4869 scope.go:117] "RemoveContainer" containerID="98db281657093f71f44016c866f6e5bae319d20d6457df6f0f5adec6f4af40f2" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.593884 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.594285 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.594290 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.594335 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.594702 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.594789 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.594860 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.595133 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.595165 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.595169 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.595649 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.595843 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.595842 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.595923 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.596160 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.596393 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.596974 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.597111 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.597129 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.597729 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.597830 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.597849 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.597895 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.598198 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.598257 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.598551 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.598955 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.599093 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: E0218 05:48:43.599109 4869 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 05:48:43 crc kubenswrapper[4869]: E0218 05:48:43.599198 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:44.099171502 +0000 UTC m=+21.268259754 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.599938 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.600279 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.600359 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.600379 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.600908 4869 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.601001 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.601095 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.601147 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.601321 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.601401 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.601840 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.602583 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.602706 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.602950 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.603006 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: E0218 05:48:43.603100 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 05:48:43 crc kubenswrapper[4869]: E0218 05:48:43.603120 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 05:48:43 crc kubenswrapper[4869]: E0218 05:48:43.603134 4869 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.603171 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: E0218 05:48:43.603190 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:44.103171955 +0000 UTC m=+21.272260187 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.603471 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e7e81a8-99d2-4752-8198-fbf3f6bfa860\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887726f89bfa0de3b913d9c306c514ff0169d4c6029c5a59fbf6d9f6ed9d22ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c88479e93208a5a7ce906e7547c7978ea44ed40294b49082c24c97c7c719dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://325ec31cd94a8e94900dbf516361ff49318d8d866df876657162a22ac4efefb2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98db281657093f71f44016c866f6e5bae319d20d6457df6f0f5adec6f4af40f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b90d1a9774d318ea853311692c2dde1863497547c151b7e7826531868e619b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha
256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db2fa5a8e1ca17324ec5e23c6dc900b1f40e4da54f3cbedf2ae80f6e89c47301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2fa5a8e1ca17324ec5e23c6dc900b1f40e4da54f3cbedf2ae80f6e89c47301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T05:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.604181 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" 
(OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.604301 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.604556 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.604706 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.604854 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.604972 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.607979 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.608565 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: E0218 05:48:43.611874 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 05:48:43 crc kubenswrapper[4869]: E0218 05:48:43.611897 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 05:48:43 crc kubenswrapper[4869]: E0218 05:48:43.611910 4869 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:43 crc kubenswrapper[4869]: E0218 05:48:43.611966 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:44.111948278 +0000 UTC m=+21.281036510 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.612502 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.614209 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.614341 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.615154 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.615161 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.615246 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.615352 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.615350 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.616172 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.617209 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.617322 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.617371 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.617594 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.617667 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.617925 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.617953 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.617975 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.618253 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.618543 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.618727 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.618960 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.619034 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.622136 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.623021 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.623155 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.623513 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.623593 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.623661 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.623720 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.623802 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.623834 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.625457 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.626328 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.629974 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.630297 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.634960 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.635576 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.635790 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.635828 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.636050 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.636466 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.636813 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.636879 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.637147 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.637867 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.639143 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.639317 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.639471 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.639467 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.639636 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.639727 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.639979 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.640489 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.649200 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.650644 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.658142 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.658957 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.659509 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.659618 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.659862 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.663663 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.665206 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.669339 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.669386 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.677482 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e7e81a8-99d2-4752-8198-fbf3f6bfa860\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887726f89bfa0de3b913d9c306c514ff0169d4c6029c5a59fbf6d9f6ed9d22ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c88479e93208a5a7ce906e7547c7978ea44ed40294b49082c24c97c7c719dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://325ec31cd94a8e94900dbf516361ff49318d8d866df876657162a22ac4efefb2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98db281657093f71f44016c866f6e5bae319d20d6457df6f0f5adec6f4af40f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98db281657093f71f44016c866f6e5bae319d20d6457df6f0f5adec6f4af40f2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T05:48:42Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0218 05:48:36.864320 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 05:48:36.866778 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-322554816/tls.crt::/tmp/serving-cert-322554816/tls.key\\\\\\\"\\\\nI0218 05:48:42.581357 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 05:48:42.587357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 05:48:42.587407 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 05:48:42.587446 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 05:48:42.587459 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 05:48:42.599362 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 05:48:42.599408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 05:48:42.599419 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 05:48:42.599428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 05:48:42.599435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 05:48:42.599442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 05:48:42.599449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 05:48:42.599450 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 05:48:42.602493 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T05:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b90d1a9774d318ea853311692c2dde1863497547c151b7e7826531868e619b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db2fa5a8e1ca17324ec5e23c6dc900b1f40e4da54f3cbedf2ae80f6e89c47301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2fa5a8e1ca17324ec5e23c6dc900b1f40e4da54f3cbedf2ae80f6e89c47301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T05:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.687532 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.687607 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.687656 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.687769 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.687785 4869 
reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.687799 4869 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.687837 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.687874 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.687885 4869 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.687932 4869 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.687946 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.687956 4869 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.687965 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.687975 4869 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.687984 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.687995 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688003 4869 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688012 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" 
DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688020 4869 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688029 4869 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688039 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688047 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688055 4869 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688064 4869 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688073 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688081 4869 reconciler_common.go:293] 
"Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688090 4869 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688100 4869 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688112 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688123 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688135 4869 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688146 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688157 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688168 4869 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688179 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688191 4869 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688208 4869 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688218 4869 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688230 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688241 4869 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" 
DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688253 4869 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688261 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688271 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688280 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688290 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688299 4869 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688308 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 
18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688317 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688326 4869 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688336 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688345 4869 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688353 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688361 4869 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688370 4869 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688378 4869 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688387 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688396 4869 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688405 4869 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688413 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688422 4869 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688430 4869 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688438 4869 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 
18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688448 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688457 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688467 4869 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688476 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688484 4869 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688493 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688503 4869 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688512 4869 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688520 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688529 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688538 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688547 4869 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688557 4869 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688565 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688574 4869 reconciler_common.go:293] "Volume detached for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688587 4869 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688599 4869 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688612 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688622 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688632 4869 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688641 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688649 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" 
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688659 4869 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688667 4869 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688675 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688865 4869 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688876 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688884 4869 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688892 4869 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688902 4869 reconciler_common.go:293] "Volume detached for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688910 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688919 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688927 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688936 4869 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688946 4869 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688956 4869 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688965 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688973 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688982 4869 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688990 4869 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.688998 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689006 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689023 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689031 4869 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc 
kubenswrapper[4869]: I0218 05:48:43.689039 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689048 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689057 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689065 4869 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689074 4869 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689082 4869 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689091 4869 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689098 4869 reconciler_common.go:293] "Volume detached 
for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689106 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689114 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689122 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689131 4869 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689139 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689148 4869 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689156 4869 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689165 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689173 4869 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689181 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689189 4869 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689198 4869 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689206 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689214 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node 
\"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689224 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689232 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689241 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689250 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689259 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689267 4869 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689278 4869 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 
18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689290 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689301 4869 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689312 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689324 4869 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689333 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689342 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689351 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689360 4869 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689369 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689377 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689390 4869 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689404 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689417 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689428 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689436 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689445 4869 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689453 4869 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689461 4869 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689470 4869 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689482 4869 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689496 4869 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689507 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689518 4869 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689529 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689538 4869 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689547 4869 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689556 4869 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689566 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689578 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689594 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689608 4869 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689620 4869 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689631 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689643 4869 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689653 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689823 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689838 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689850 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689861 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689872 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689883 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689894 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689908 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689920 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689953 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689969 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689980 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.689993 4869 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.690005 4869 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.690015 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.690026 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.690037 4869 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.690048 4869 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.690060 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.690071 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.690081 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.690094 4869 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.690104 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.690115 4869 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.690126 4869 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.690137 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.699685 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.707438 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.715062 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.720948 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.721082 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.722727 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.728129 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.730583 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 18 05:48:43 crc kubenswrapper[4869]: I0218 05:48:43.732711 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 18 05:48:43 crc kubenswrapper[4869]: W0218 05:48:43.741974 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-045937650ce2847246f2fcde795a1179cbd5088ca680c0f3072d5cd9f064bd80 WatchSource:0}: Error finding container 045937650ce2847246f2fcde795a1179cbd5088ca680c0f3072d5cd9f064bd80: Status 404 returned error can't find the container with id 045937650ce2847246f2fcde795a1179cbd5088ca680c0f3072d5cd9f064bd80
Feb 18 05:48:43 crc kubenswrapper[4869]: W0218 05:48:43.748485 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-0a380c2419a0caf112179146314492910f18a05d77ed7c12fa9c063977910c00 WatchSource:0}: Error finding container 0a380c2419a0caf112179146314492910f18a05d77ed7c12fa9c063977910c00: Status 404 returned error can't find the container with id 0a380c2419a0caf112179146314492910f18a05d77ed7c12fa9c063977910c00
Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.092847 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:48:44 crc kubenswrapper[4869]: E0218 05:48:44.093149 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:45.09311339 +0000 UTC m=+22.262201672 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.194697 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.194768 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.194801 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.194826 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 05:48:44 crc kubenswrapper[4869]: E0218 05:48:44.194929 4869 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 18 05:48:44 crc kubenswrapper[4869]: E0218 05:48:44.194983 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:45.194967268 +0000 UTC m=+22.364055500 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 18 05:48:44 crc kubenswrapper[4869]: E0218 05:48:44.195057 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 18 05:48:44 crc kubenswrapper[4869]: E0218 05:48:44.195074 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 18 05:48:44 crc kubenswrapper[4869]: E0218 05:48:44.195085 4869 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 05:48:44 crc kubenswrapper[4869]: E0218 05:48:44.195113 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:45.195104701 +0000 UTC m=+22.364192933 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 05:48:44 crc kubenswrapper[4869]: E0218 05:48:44.195148 4869 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 18 05:48:44 crc kubenswrapper[4869]: E0218 05:48:44.195175 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:45.195167663 +0000 UTC m=+22.364255895 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 18 05:48:44 crc kubenswrapper[4869]: E0218 05:48:44.195224 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 18 05:48:44 crc kubenswrapper[4869]: E0218 05:48:44.195237 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 18 05:48:44 crc kubenswrapper[4869]: E0218 05:48:44.195245 4869 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 05:48:44 crc kubenswrapper[4869]: E0218 05:48:44.195269 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:45.195261485 +0000 UTC m=+22.364349717 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.420983 4869 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 13:47:26.661362377 +0000 UTC Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.578491 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"0a380c2419a0caf112179146314492910f18a05d77ed7c12fa9c063977910c00"} Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.580095 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"baae34fc7e1c9392e95fce40e3ff30f70a407767e08dca8c34c31f31421b3852"} Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.580119 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"045937650ce2847246f2fcde795a1179cbd5088ca680c0f3072d5cd9f064bd80"} Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.582235 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d838bfd1d3750fe6905b78bd65c9aa739cf73abb6a2dea431bb40f41c76bc196"} Feb 
18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.582297 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c2fcd51d1191d06852236bde34127bf20d208c066cbf9c7a15c5b9a1a6ae12ff"} Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.582312 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7f4ff016a7001485845fa5a22235b20381aaf97cabfab9ba40f8694640a7f95c"} Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.584940 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.588305 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"379c3d476e040da1ccc05cff51bae550969be8fd3ea536ebb7b2f19aab5aae7d"} Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.588723 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.601402 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:44Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.631601 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:44Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.676218 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e7e81a8-99d2-4752-8198-fbf3f6bfa860\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887726f89bfa0de3b913d9c306c514ff0169d4c6029c5a59fbf6d9f6ed9d22ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c88479e93208a5a7ce906e7547c7978ea44ed40294b49082c24c97c7c719dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://325ec31cd94a8e94900dbf516361ff49318d8d866df876657162a22ac4efefb2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98db281657093f71f44016c866f6e5bae319d20d6457df6f0f5adec6f4af40f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98db281657093f71f44016c866f6e5bae319d20d6457df6f0f5adec6f4af40f2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T05:48:42Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": 
net/http: TLS handshake timeout\\\\nI0218 05:48:36.864320 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 05:48:36.866778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-322554816/tls.crt::/tmp/serving-cert-322554816/tls.key\\\\\\\"\\\\nI0218 05:48:42.581357 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 05:48:42.587357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 05:48:42.587407 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 05:48:42.587446 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 05:48:42.587459 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 05:48:42.599362 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 05:48:42.599408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 05:48:42.599419 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 05:48:42.599428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 05:48:42.599435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 05:48:42.599442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 05:48:42.599449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 05:48:42.599450 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 05:48:42.602493 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T05:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b90d1a9774d318ea853311692c2dde1863497547c151b7e7826531868e619b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db2fa5a8e1ca17324ec5e23c6dc900b1f40e4da54f3cbedf2ae80f6e89c47301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2fa5a8e1ca17324ec5e23c6dc900b1f40e4da54f3cbedf2ae80f6e89c47301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T05:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:44Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.697006 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:44Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.725353 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baae34fc7e1c9392e95fce40e3ff30f70a407767e08dca8c34c31f31421b3852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:44Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.738846 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:44Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.751726 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:44Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.766614 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:44Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.785410 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baae34fc7e1c9392e95fce40e3ff30f70a407767e08dca8c34c31f31421b3852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:44Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.797412 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:44Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.807103 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:44Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.818544 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e7e81a8-99d2-4752-8198-fbf3f6bfa860\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887726f89bfa0de3b913d9c306c514ff0169d4c6029c5a59fbf6d9f6ed9d22ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c88479e93208a5a7ce906e7547c7978ea44ed40294b49082c24c97c7c719dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://325ec31cd94a8e94900dbf516361ff49318d8d866df876657162a22ac4efefb2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://379c3d476e040da1ccc05cff51bae550969be8fd3ea536ebb7b2f19aab5aae7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98db281657093f71f44016c866f6e5bae319d20d6457df6f0f5adec6f4af40f2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T05:48:42Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0218 05:48:36.864320 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 05:48:36.866778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-322554816/tls.crt::/tmp/serving-cert-322554816/tls.key\\\\\\\"\\\\nI0218 05:48:42.581357 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 05:48:42.587357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 05:48:42.587407 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 05:48:42.587446 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 05:48:42.587459 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 05:48:42.599362 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 05:48:42.599408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 05:48:42.599419 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 05:48:42.599428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 05:48:42.599435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 05:48:42.599442 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 05:48:42.599449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 05:48:42.599450 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 05:48:42.602493 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T05:48:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b90d1a9774d318ea853311692c2dde1863497547c151b7e7826531868e619b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db2fa5a8e1ca17324ec5e23c6dc900b1f40e4da54f3cbedf2ae80f6e89c47301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2fa5a8e1ca17324ec5e23c6dc900b1f40e4da54f3cbedf2ae80f6e89c47301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T05:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:44Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.831395 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:43Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:44Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:44 crc kubenswrapper[4869]: I0218 05:48:44.845754 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d838bfd1d3750fe6905b78bd65c9aa739cf73abb6a2dea431bb40f41c76bc196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2fcd51d1191d06852236bde34127bf20d208c066cbf9c7a15c5b9a1a6ae12ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d
773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:44Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.103310 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:45 crc kubenswrapper[4869]: E0218 05:48:45.103467 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:47.103449294 +0000 UTC m=+24.272537526 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.204726 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.204783 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.204805 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.204821 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:45 crc kubenswrapper[4869]: E0218 05:48:45.204885 4869 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 05:48:45 crc kubenswrapper[4869]: E0218 05:48:45.204931 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:47.204915913 +0000 UTC m=+24.374004155 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 05:48:45 crc kubenswrapper[4869]: E0218 05:48:45.204934 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 05:48:45 crc kubenswrapper[4869]: E0218 05:48:45.204982 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 05:48:45 crc kubenswrapper[4869]: E0218 05:48:45.204996 4869 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:45 crc kubenswrapper[4869]: E0218 05:48:45.204939 4869 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 05:48:45 crc kubenswrapper[4869]: E0218 05:48:45.205055 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:47.205035346 +0000 UTC m=+24.374123578 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:45 crc kubenswrapper[4869]: E0218 05:48:45.205093 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:47.205079527 +0000 UTC m=+24.374167869 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 18 05:48:45 crc kubenswrapper[4869]: E0218 05:48:45.205114 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 18 05:48:45 crc kubenswrapper[4869]: E0218 05:48:45.205145 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 18 05:48:45 crc kubenswrapper[4869]: E0218 05:48:45.205157 4869 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 05:48:45 crc kubenswrapper[4869]: E0218 05:48:45.205214 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:47.205199099 +0000 UTC m=+24.374287331 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.421842 4869 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 09:04:20.319034485 +0000 UTC
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.469224 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.469325 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.469427 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 05:48:45 crc kubenswrapper[4869]: E0218 05:48:45.469454 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 18 05:48:45 crc kubenswrapper[4869]: E0218 05:48:45.469541 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 18 05:48:45 crc kubenswrapper[4869]: E0218 05:48:45.469636 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.475571 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.476528 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.479344 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.480044 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.480610 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.481079 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.481641 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.482201 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.482867 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.483391 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.483910 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.484516 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.485018 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.485491 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.485978 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.486446 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.489128 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.489565 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.490138 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.491077 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.491483 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.492371 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.492942 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.493898 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.494268 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.494857 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.495822 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.496319 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.497245 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.497655 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.498513 4869 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.498619 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.500273 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.500778 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.501573 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.503303 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.503912 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.504759 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.505338 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.506356 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.506905 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.507855 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.508438 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.509605 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.510056 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.510946 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.511686 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.512781 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.513340 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.514364 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.514855 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.515678 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.516243 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Feb 18 05:48:45 crc kubenswrapper[4869]: I0218 05:48:45.516671 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.351306 4869 csr.go:261] certificate signing request csr-j7rg2 is approved, waiting to be issued
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.371135 4869 csr.go:257] certificate signing request csr-j7rg2 is issued
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.422461 4869 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 07:48:15.102443756 +0000 UTC
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.455912 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-rw6ns"]
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.456185 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rw6ns"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.458302 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.459317 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.459337 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.459365 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.520710 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=3.520692398 podStartE2EDuration="3.520692398s" podCreationTimestamp="2026-02-18 05:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:46.519310534 +0000 UTC m=+23.688398776" watchObservedRunningTime="2026-02-18 05:48:46.520692398 +0000 UTC m=+23.689780630"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.594717 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"6ea0e960a05beed58883ede743b2c826b1df867e3d7eead98266f2b25ee20bc7"}
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.615377 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-47b4c"]
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.615657 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-47b4c"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.617203 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/194aaf06-f563-4f03-aee6-c6e124706ef0-host\") pod \"node-ca-rw6ns\" (UID: \"194aaf06-f563-4f03-aee6-c6e124706ef0\") " pod="openshift-image-registry/node-ca-rw6ns"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.617250 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/194aaf06-f563-4f03-aee6-c6e124706ef0-serviceca\") pod \"node-ca-rw6ns\" (UID: \"194aaf06-f563-4f03-aee6-c6e124706ef0\") " pod="openshift-image-registry/node-ca-rw6ns"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.617271 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99b5j\" (UniqueName: \"kubernetes.io/projected/194aaf06-f563-4f03-aee6-c6e124706ef0-kube-api-access-99b5j\") pod \"node-ca-rw6ns\" (UID: \"194aaf06-f563-4f03-aee6-c6e124706ef0\") " pod="openshift-image-registry/node-ca-rw6ns"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.618039 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.618204 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.618590 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.656844 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-lv6qh"]
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.657194 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.659194 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.659387 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.659535 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.661321 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.661405 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.708943 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-5kq9x"]
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.713461 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5kq9x"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.714225 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-2gzwj"]
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.714664 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2gzwj"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.720003 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzb5v\" (UniqueName: \"kubernetes.io/projected/296f0ea2-45d7-4cde-bd8f-09f01b56b82b-kube-api-access-vzb5v\") pod \"node-resolver-47b4c\" (UID: \"296f0ea2-45d7-4cde-bd8f-09f01b56b82b\") " pod="openshift-dns/node-resolver-47b4c"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.720060 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/194aaf06-f563-4f03-aee6-c6e124706ef0-serviceca\") pod \"node-ca-rw6ns\" (UID: \"194aaf06-f563-4f03-aee6-c6e124706ef0\") " pod="openshift-image-registry/node-ca-rw6ns"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.720076 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.720083 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99b5j\" (UniqueName: \"kubernetes.io/projected/194aaf06-f563-4f03-aee6-c6e124706ef0-kube-api-access-99b5j\") pod \"node-ca-rw6ns\" (UID: \"194aaf06-f563-4f03-aee6-c6e124706ef0\") " pod="openshift-image-registry/node-ca-rw6ns"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.720400 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/194aaf06-f563-4f03-aee6-c6e124706ef0-host\") pod \"node-ca-rw6ns\" (UID: \"194aaf06-f563-4f03-aee6-c6e124706ef0\") " pod="openshift-image-registry/node-ca-rw6ns"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.720540 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/194aaf06-f563-4f03-aee6-c6e124706ef0-host\") pod \"node-ca-rw6ns\" (UID: \"194aaf06-f563-4f03-aee6-c6e124706ef0\") " pod="openshift-image-registry/node-ca-rw6ns"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.720621 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/296f0ea2-45d7-4cde-bd8f-09f01b56b82b-hosts-file\") pod \"node-resolver-47b4c\" (UID: \"296f0ea2-45d7-4cde-bd8f-09f01b56b82b\") " pod="openshift-dns/node-resolver-47b4c"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.720937 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.721022 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.721115 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.721150 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.721252 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.721396 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/194aaf06-f563-4f03-aee6-c6e124706ef0-serviceca\") pod \"node-ca-rw6ns\" (UID: \"194aaf06-f563-4f03-aee6-c6e124706ef0\") " pod="openshift-image-registry/node-ca-rw6ns"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.721700 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.723888 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-ckzlt"]
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.724446 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckzlt"
Feb 18 05:48:46 crc kubenswrapper[4869]: E0218 05:48:46.724600 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckzlt" podUID="1c4bcbdd-2490-4d47-b2b3-a2e832c63100"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.743008 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99b5j\" (UniqueName: \"kubernetes.io/projected/194aaf06-f563-4f03-aee6-c6e124706ef0-kube-api-access-99b5j\") pod \"node-ca-rw6ns\" (UID: \"194aaf06-f563-4f03-aee6-c6e124706ef0\") " pod="openshift-image-registry/node-ca-rw6ns"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.769309 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rw6ns"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.821634 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a6bdacde-403f-48ac-8366-547278b19432-cnibin\") pod \"multus-additional-cni-plugins-5kq9x\" (UID: \"a6bdacde-403f-48ac-8366-547278b19432\") " pod="openshift-multus/multus-additional-cni-plugins-5kq9x"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.821680 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzb5v\" (UniqueName: \"kubernetes.io/projected/296f0ea2-45d7-4cde-bd8f-09f01b56b82b-kube-api-access-vzb5v\") pod \"node-resolver-47b4c\" (UID: \"296f0ea2-45d7-4cde-bd8f-09f01b56b82b\") " pod="openshift-dns/node-resolver-47b4c"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.821699 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-hostroot\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.821718 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/781aec66-5fc7-4161-a704-cc78830d525d-mcd-auth-proxy-config\") pod \"machine-config-daemon-lv6qh\" (UID: \"781aec66-5fc7-4161-a704-cc78830d525d\") " pod="openshift-machine-config-operator/machine-config-daemon-lv6qh"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.821737 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a6bdacde-403f-48ac-8366-547278b19432-cni-binary-copy\") pod \"multus-additional-cni-plugins-5kq9x\" (UID: \"a6bdacde-403f-48ac-8366-547278b19432\") " pod="openshift-multus/multus-additional-cni-plugins-5kq9x"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.821769 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a6bdacde-403f-48ac-8366-547278b19432-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5kq9x\" (UID: \"a6bdacde-403f-48ac-8366-547278b19432\") " pod="openshift-multus/multus-additional-cni-plugins-5kq9x"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.821788 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/87706ebb-d517-4b38-a542-d0afd6c8c9c2-cni-binary-copy\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.821805 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-multus-conf-dir\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.821820 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/87706ebb-d517-4b38-a542-d0afd6c8c9c2-multus-daemon-config\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.821836 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qhjp\" (UniqueName: \"kubernetes.io/projected/a6bdacde-403f-48ac-8366-547278b19432-kube-api-access-9qhjp\") pod \"multus-additional-cni-plugins-5kq9x\" (UID: \"a6bdacde-403f-48ac-8366-547278b19432\") " pod="openshift-multus/multus-additional-cni-plugins-5kq9x"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.821854 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/781aec66-5fc7-4161-a704-cc78830d525d-rootfs\") pod \"machine-config-daemon-lv6qh\" (UID: \"781aec66-5fc7-4161-a704-cc78830d525d\") " pod="openshift-machine-config-operator/machine-config-daemon-lv6qh"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.821870 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a6bdacde-403f-48ac-8366-547278b19432-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5kq9x\" (UID: \"a6bdacde-403f-48ac-8366-547278b19432\") " pod="openshift-multus/multus-additional-cni-plugins-5kq9x"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.821885 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-host-var-lib-cni-multus\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.821902 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph5vd\" (UniqueName: \"kubernetes.io/projected/781aec66-5fc7-4161-a704-cc78830d525d-kube-api-access-ph5vd\") pod \"machine-config-daemon-lv6qh\" (UID: \"781aec66-5fc7-4161-a704-cc78830d525d\") " pod="openshift-machine-config-operator/machine-config-daemon-lv6qh"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.821928 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/296f0ea2-45d7-4cde-bd8f-09f01b56b82b-hosts-file\") pod \"node-resolver-47b4c\" (UID: \"296f0ea2-45d7-4cde-bd8f-09f01b56b82b\") " pod="openshift-dns/node-resolver-47b4c"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.821944 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-host-var-lib-cni-bin\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.821972 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmc5h\" (UniqueName: \"kubernetes.io/projected/1c4bcbdd-2490-4d47-b2b3-a2e832c63100-kube-api-access-wmc5h\") pod \"network-metrics-daemon-ckzlt\" (UID: \"1c4bcbdd-2490-4d47-b2b3-a2e832c63100\") " pod="openshift-multus/network-metrics-daemon-ckzlt"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.821999 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-system-cni-dir\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.822016 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-cnibin\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.822030 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-os-release\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.822050 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-multus-socket-dir-parent\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.822070 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c642\" (UniqueName: \"kubernetes.io/projected/87706ebb-d517-4b38-a542-d0afd6c8c9c2-kube-api-access-4c642\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.822093 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-multus-cni-dir\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.822110 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-host-run-netns\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj"
Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.822130 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started
for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-host-run-k8s-cni-cncf-io\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.822127 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/296f0ea2-45d7-4cde-bd8f-09f01b56b82b-hosts-file\") pod \"node-resolver-47b4c\" (UID: \"296f0ea2-45d7-4cde-bd8f-09f01b56b82b\") " pod="openshift-dns/node-resolver-47b4c" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.822163 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-etc-kubernetes\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.822189 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a6bdacde-403f-48ac-8366-547278b19432-os-release\") pod \"multus-additional-cni-plugins-5kq9x\" (UID: \"a6bdacde-403f-48ac-8366-547278b19432\") " pod="openshift-multus/multus-additional-cni-plugins-5kq9x" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.822208 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c4bcbdd-2490-4d47-b2b3-a2e832c63100-metrics-certs\") pod \"network-metrics-daemon-ckzlt\" (UID: \"1c4bcbdd-2490-4d47-b2b3-a2e832c63100\") " pod="openshift-multus/network-metrics-daemon-ckzlt" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.822231 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-host-var-lib-kubelet\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.822247 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/781aec66-5fc7-4161-a704-cc78830d525d-proxy-tls\") pod \"machine-config-daemon-lv6qh\" (UID: \"781aec66-5fc7-4161-a704-cc78830d525d\") " pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.822261 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6bdacde-403f-48ac-8366-547278b19432-system-cni-dir\") pod \"multus-additional-cni-plugins-5kq9x\" (UID: \"a6bdacde-403f-48ac-8366-547278b19432\") " pod="openshift-multus/multus-additional-cni-plugins-5kq9x" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.822282 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-host-run-multus-certs\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.844071 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzb5v\" (UniqueName: \"kubernetes.io/projected/296f0ea2-45d7-4cde-bd8f-09f01b56b82b-kube-api-access-vzb5v\") pod \"node-resolver-47b4c\" (UID: \"296f0ea2-45d7-4cde-bd8f-09f01b56b82b\") " pod="openshift-dns/node-resolver-47b4c" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.879688 4869 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-node-mvs9q"] Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.880491 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.884897 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.885173 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.885333 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.885470 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.885613 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.885792 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.885977 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.922849 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/781aec66-5fc7-4161-a704-cc78830d525d-rootfs\") pod \"machine-config-daemon-lv6qh\" (UID: \"781aec66-5fc7-4161-a704-cc78830d525d\") " pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.923248 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a6bdacde-403f-48ac-8366-547278b19432-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5kq9x\" (UID: \"a6bdacde-403f-48ac-8366-547278b19432\") " pod="openshift-multus/multus-additional-cni-plugins-5kq9x" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.923368 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-host-var-lib-cni-multus\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.923510 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph5vd\" (UniqueName: \"kubernetes.io/projected/781aec66-5fc7-4161-a704-cc78830d525d-kube-api-access-ph5vd\") pod \"machine-config-daemon-lv6qh\" (UID: \"781aec66-5fc7-4161-a704-cc78830d525d\") " pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.923603 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-host-var-lib-cni-bin\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.923696 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmc5h\" (UniqueName: \"kubernetes.io/projected/1c4bcbdd-2490-4d47-b2b3-a2e832c63100-kube-api-access-wmc5h\") pod \"network-metrics-daemon-ckzlt\" (UID: \"1c4bcbdd-2490-4d47-b2b3-a2e832c63100\") " pod="openshift-multus/network-metrics-daemon-ckzlt" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 
05:48:46.924122 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-system-cni-dir\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.924225 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-os-release\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.924305 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-multus-socket-dir-parent\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.924374 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-system-cni-dir\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.923641 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-host-var-lib-cni-multus\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.924390 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c642\" (UniqueName: 
\"kubernetes.io/projected/87706ebb-d517-4b38-a542-d0afd6c8c9c2-kube-api-access-4c642\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.923050 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/781aec66-5fc7-4161-a704-cc78830d525d-rootfs\") pod \"machine-config-daemon-lv6qh\" (UID: \"781aec66-5fc7-4161-a704-cc78830d525d\") " pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.924128 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a6bdacde-403f-48ac-8366-547278b19432-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5kq9x\" (UID: \"a6bdacde-403f-48ac-8366-547278b19432\") " pod="openshift-multus/multus-additional-cni-plugins-5kq9x" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.923712 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-host-var-lib-cni-bin\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.924532 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-multus-socket-dir-parent\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.924600 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-cnibin\") pod \"multus-2gzwj\" (UID: 
\"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.924499 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-cnibin\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.924655 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-os-release\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.924678 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-multus-cni-dir\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.924713 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-host-run-netns\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.924782 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-host-run-k8s-cni-cncf-io\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.924805 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-etc-kubernetes\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.924825 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a6bdacde-403f-48ac-8366-547278b19432-os-release\") pod \"multus-additional-cni-plugins-5kq9x\" (UID: \"a6bdacde-403f-48ac-8366-547278b19432\") " pod="openshift-multus/multus-additional-cni-plugins-5kq9x" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.924846 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-host-run-netns\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.924855 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-host-var-lib-kubelet\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.924874 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-host-run-k8s-cni-cncf-io\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.924880 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/781aec66-5fc7-4161-a704-cc78830d525d-proxy-tls\") pod \"machine-config-daemon-lv6qh\" (UID: \"781aec66-5fc7-4161-a704-cc78830d525d\") " pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.924915 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-etc-kubernetes\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.924932 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-host-var-lib-kubelet\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.924905 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a6bdacde-403f-48ac-8366-547278b19432-os-release\") pod \"multus-additional-cni-plugins-5kq9x\" (UID: \"a6bdacde-403f-48ac-8366-547278b19432\") " pod="openshift-multus/multus-additional-cni-plugins-5kq9x" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.924956 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6bdacde-403f-48ac-8366-547278b19432-system-cni-dir\") pod \"multus-additional-cni-plugins-5kq9x\" (UID: \"a6bdacde-403f-48ac-8366-547278b19432\") " pod="openshift-multus/multus-additional-cni-plugins-5kq9x" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.924972 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-multus-cni-dir\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.924997 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c4bcbdd-2490-4d47-b2b3-a2e832c63100-metrics-certs\") pod \"network-metrics-daemon-ckzlt\" (UID: \"1c4bcbdd-2490-4d47-b2b3-a2e832c63100\") " pod="openshift-multus/network-metrics-daemon-ckzlt" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.925019 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6bdacde-403f-48ac-8366-547278b19432-system-cni-dir\") pod \"multus-additional-cni-plugins-5kq9x\" (UID: \"a6bdacde-403f-48ac-8366-547278b19432\") " pod="openshift-multus/multus-additional-cni-plugins-5kq9x" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.925066 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-host-run-multus-certs\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: E0218 05:48:46.925116 4869 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.925162 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-host-run-multus-certs\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.925124 
4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a6bdacde-403f-48ac-8366-547278b19432-cnibin\") pod \"multus-additional-cni-plugins-5kq9x\" (UID: \"a6bdacde-403f-48ac-8366-547278b19432\") " pod="openshift-multus/multus-additional-cni-plugins-5kq9x" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.925165 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a6bdacde-403f-48ac-8366-547278b19432-cnibin\") pod \"multus-additional-cni-plugins-5kq9x\" (UID: \"a6bdacde-403f-48ac-8366-547278b19432\") " pod="openshift-multus/multus-additional-cni-plugins-5kq9x" Feb 18 05:48:46 crc kubenswrapper[4869]: E0218 05:48:46.925206 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c4bcbdd-2490-4d47-b2b3-a2e832c63100-metrics-certs podName:1c4bcbdd-2490-4d47-b2b3-a2e832c63100 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:47.425184394 +0000 UTC m=+24.594272626 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c4bcbdd-2490-4d47-b2b3-a2e832c63100-metrics-certs") pod "network-metrics-daemon-ckzlt" (UID: "1c4bcbdd-2490-4d47-b2b3-a2e832c63100") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.925265 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-hostroot\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.925287 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/781aec66-5fc7-4161-a704-cc78830d525d-mcd-auth-proxy-config\") pod \"machine-config-daemon-lv6qh\" (UID: \"781aec66-5fc7-4161-a704-cc78830d525d\") " pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.925319 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a6bdacde-403f-48ac-8366-547278b19432-cni-binary-copy\") pod \"multus-additional-cni-plugins-5kq9x\" (UID: \"a6bdacde-403f-48ac-8366-547278b19432\") " pod="openshift-multus/multus-additional-cni-plugins-5kq9x" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.925342 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a6bdacde-403f-48ac-8366-547278b19432-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5kq9x\" (UID: \"a6bdacde-403f-48ac-8366-547278b19432\") " pod="openshift-multus/multus-additional-cni-plugins-5kq9x" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.925364 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-multus-conf-dir\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.925386 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/87706ebb-d517-4b38-a542-d0afd6c8c9c2-multus-daemon-config\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.925383 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-hostroot\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.925406 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qhjp\" (UniqueName: \"kubernetes.io/projected/a6bdacde-403f-48ac-8366-547278b19432-kube-api-access-9qhjp\") pod \"multus-additional-cni-plugins-5kq9x\" (UID: \"a6bdacde-403f-48ac-8366-547278b19432\") " pod="openshift-multus/multus-additional-cni-plugins-5kq9x" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.925451 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87706ebb-d517-4b38-a542-d0afd6c8c9c2-multus-conf-dir\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.925501 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/87706ebb-d517-4b38-a542-d0afd6c8c9c2-cni-binary-copy\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.926131 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a6bdacde-403f-48ac-8366-547278b19432-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5kq9x\" (UID: \"a6bdacde-403f-48ac-8366-547278b19432\") " pod="openshift-multus/multus-additional-cni-plugins-5kq9x" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.926360 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/87706ebb-d517-4b38-a542-d0afd6c8c9c2-multus-daemon-config\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.927184 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/781aec66-5fc7-4161-a704-cc78830d525d-mcd-auth-proxy-config\") pod \"machine-config-daemon-lv6qh\" (UID: \"781aec66-5fc7-4161-a704-cc78830d525d\") " pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.927210 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-47b4c" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.927218 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a6bdacde-403f-48ac-8366-547278b19432-cni-binary-copy\") pod \"multus-additional-cni-plugins-5kq9x\" (UID: \"a6bdacde-403f-48ac-8366-547278b19432\") " pod="openshift-multus/multus-additional-cni-plugins-5kq9x" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.927927 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/781aec66-5fc7-4161-a704-cc78830d525d-proxy-tls\") pod \"machine-config-daemon-lv6qh\" (UID: \"781aec66-5fc7-4161-a704-cc78830d525d\") " pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.928102 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/87706ebb-d517-4b38-a542-d0afd6c8c9c2-cni-binary-copy\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.947656 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph5vd\" (UniqueName: \"kubernetes.io/projected/781aec66-5fc7-4161-a704-cc78830d525d-kube-api-access-ph5vd\") pod \"machine-config-daemon-lv6qh\" (UID: \"781aec66-5fc7-4161-a704-cc78830d525d\") " pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.948869 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c642\" (UniqueName: \"kubernetes.io/projected/87706ebb-d517-4b38-a542-d0afd6c8c9c2-kube-api-access-4c642\") pod \"multus-2gzwj\" (UID: \"87706ebb-d517-4b38-a542-d0afd6c8c9c2\") " 
pod="openshift-multus/multus-2gzwj" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.952979 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qhjp\" (UniqueName: \"kubernetes.io/projected/a6bdacde-403f-48ac-8366-547278b19432-kube-api-access-9qhjp\") pod \"multus-additional-cni-plugins-5kq9x\" (UID: \"a6bdacde-403f-48ac-8366-547278b19432\") " pod="openshift-multus/multus-additional-cni-plugins-5kq9x" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.954006 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmc5h\" (UniqueName: \"kubernetes.io/projected/1c4bcbdd-2490-4d47-b2b3-a2e832c63100-kube-api-access-wmc5h\") pod \"network-metrics-daemon-ckzlt\" (UID: \"1c4bcbdd-2490-4d47-b2b3-a2e832c63100\") " pod="openshift-multus/network-metrics-daemon-ckzlt" Feb 18 05:48:46 crc kubenswrapper[4869]: I0218 05:48:46.967053 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.026679 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-var-lib-openvswitch\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.026774 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-node-log\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.026845 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a30082d3-c125-4e76-8ead-3633b967d974-ovnkube-script-lib\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.026866 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-slash\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.026884 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-systemd-units\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.026903 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-kubelet\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.026919 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-run-ovn\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.026945 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-etc-openvswitch\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.026961 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-run-netns\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.026983 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-cni-bin\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.027001 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-cni-netd\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.027018 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc 
kubenswrapper[4869]: I0218 05:48:47.027040 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-run-systemd\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.027056 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxc85\" (UniqueName: \"kubernetes.io/projected/a30082d3-c125-4e76-8ead-3633b967d974-kube-api-access-xxc85\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.027079 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-run-openvswitch\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.027095 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a30082d3-c125-4e76-8ead-3633b967d974-ovnkube-config\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.027111 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a30082d3-c125-4e76-8ead-3633b967d974-ovn-node-metrics-cert\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.027132 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-log-socket\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.027149 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-run-ovn-kubernetes\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.027164 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a30082d3-c125-4e76-8ead-3633b967d974-env-overrides\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.029529 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5kq9x" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.038389 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-2gzwj" Feb 18 05:48:47 crc kubenswrapper[4869]: W0218 05:48:47.048164 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6bdacde_403f_48ac_8366_547278b19432.slice/crio-aad8ca6ba14f75bf9ade395e8a669512e3db3d780c27c1b8c4029ba86bf480c3 WatchSource:0}: Error finding container aad8ca6ba14f75bf9ade395e8a669512e3db3d780c27c1b8c4029ba86bf480c3: Status 404 returned error can't find the container with id aad8ca6ba14f75bf9ade395e8a669512e3db3d780c27c1b8c4029ba86bf480c3 Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.128115 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.128278 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-kubelet\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.128304 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-run-ovn\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: E0218 05:48:47.128338 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:51.128305948 +0000 UTC m=+28.297394180 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.128368 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-run-ovn\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.128434 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-etc-openvswitch\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.128442 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-kubelet\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.128468 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-run-netns\") 
pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.128531 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-etc-openvswitch\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.128514 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-run-netns\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.128607 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-cni-bin\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.128581 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-cni-bin\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.128668 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-cni-netd\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.128692 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.128714 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-run-systemd\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.128752 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxc85\" (UniqueName: \"kubernetes.io/projected/a30082d3-c125-4e76-8ead-3633b967d974-kube-api-access-xxc85\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.128756 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.128782 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-run-openvswitch\") pod \"ovnkube-node-mvs9q\" (UID: 
\"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.128715 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-cni-netd\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.128803 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a30082d3-c125-4e76-8ead-3633b967d974-ovnkube-config\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.128824 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a30082d3-c125-4e76-8ead-3633b967d974-ovn-node-metrics-cert\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.128853 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-log-socket\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.128870 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-run-ovn-kubernetes\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.128859 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-run-systemd\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.128892 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a30082d3-c125-4e76-8ead-3633b967d974-env-overrides\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.128913 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-var-lib-openvswitch\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.128939 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-node-log\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.128963 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a30082d3-c125-4e76-8ead-3633b967d974-ovnkube-script-lib\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc 
kubenswrapper[4869]: I0218 05:48:47.128982 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-slash\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.129002 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-systemd-units\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.129050 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-systemd-units\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.129079 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-log-socket\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.129102 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-run-ovn-kubernetes\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.129262 4869 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-node-log\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.129339 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-run-openvswitch\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.129442 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-var-lib-openvswitch\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.129477 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-slash\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.131482 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a30082d3-c125-4e76-8ead-3633b967d974-env-overrides\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.131838 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a30082d3-c125-4e76-8ead-3633b967d974-ovnkube-config\") pod 
\"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.133040 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a30082d3-c125-4e76-8ead-3633b967d974-ovnkube-script-lib\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.158466 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a30082d3-c125-4e76-8ead-3633b967d974-ovn-node-metrics-cert\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.168852 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxc85\" (UniqueName: \"kubernetes.io/projected/a30082d3-c125-4e76-8ead-3633b967d974-kube-api-access-xxc85\") pod \"ovnkube-node-mvs9q\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.198038 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.229850 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.230274 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.230307 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.230326 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:47 crc kubenswrapper[4869]: E0218 05:48:47.230394 4869 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Feb 18 05:48:47 crc kubenswrapper[4869]: E0218 05:48:47.230443 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:51.230430154 +0000 UTC m=+28.399518386 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 05:48:47 crc kubenswrapper[4869]: E0218 05:48:47.230814 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 05:48:47 crc kubenswrapper[4869]: E0218 05:48:47.230837 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 05:48:47 crc kubenswrapper[4869]: E0218 05:48:47.230848 4869 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:47 crc kubenswrapper[4869]: E0218 05:48:47.230876 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-02-18 05:48:51.230867494 +0000 UTC m=+28.399955726 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 05:48:47 crc kubenswrapper[4869]: E0218 05:48:47.230916 4869 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 18 05:48:47 crc kubenswrapper[4869]: E0218 05:48:47.230939 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:51.230933666 +0000 UTC m=+28.400021898 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 18 05:48:47 crc kubenswrapper[4869]: E0218 05:48:47.230977 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 18 05:48:47 crc kubenswrapper[4869]: E0218 05:48:47.230987 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 18 05:48:47 crc kubenswrapper[4869]: E0218 05:48:47.230995 4869 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 05:48:47 crc kubenswrapper[4869]: E0218 05:48:47.231066 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:51.231059179 +0000 UTC m=+28.400147411 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.320998 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qspns"]
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.321457 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qspns"
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.325626 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.325968 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.372646 4869 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-18 05:43:46 +0000 UTC, rotation deadline is 2026-11-09 03:14:28.057616343 +0000 UTC
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.372735 4869 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6333h25m40.684884556s for next certificate rotation
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.423120 4869 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 08:00:39.052357514 +0000 UTC
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.432971 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c4bcbdd-2490-4d47-b2b3-a2e832c63100-metrics-certs\") pod \"network-metrics-daemon-ckzlt\" (UID: \"1c4bcbdd-2490-4d47-b2b3-a2e832c63100\") " pod="openshift-multus/network-metrics-daemon-ckzlt"
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.433040 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/afccc8b0-8dc9-44d8-80cc-33cd8ab4400e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qspns\" (UID: \"afccc8b0-8dc9-44d8-80cc-33cd8ab4400e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qspns"
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.433072 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/afccc8b0-8dc9-44d8-80cc-33cd8ab4400e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qspns\" (UID: \"afccc8b0-8dc9-44d8-80cc-33cd8ab4400e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qspns"
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.433097 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/afccc8b0-8dc9-44d8-80cc-33cd8ab4400e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qspns\" (UID: \"afccc8b0-8dc9-44d8-80cc-33cd8ab4400e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qspns"
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.433117 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x784k\" (UniqueName: \"kubernetes.io/projected/afccc8b0-8dc9-44d8-80cc-33cd8ab4400e-kube-api-access-x784k\") pod \"ovnkube-control-plane-749d76644c-qspns\" (UID: \"afccc8b0-8dc9-44d8-80cc-33cd8ab4400e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qspns"
Feb 18 05:48:47 crc kubenswrapper[4869]: E0218 05:48:47.433217 4869 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 18 05:48:47 crc kubenswrapper[4869]: E0218 05:48:47.433324 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c4bcbdd-2490-4d47-b2b3-a2e832c63100-metrics-certs podName:1c4bcbdd-2490-4d47-b2b3-a2e832c63100 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:48.433299353 +0000 UTC m=+25.602387585 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c4bcbdd-2490-4d47-b2b3-a2e832c63100-metrics-certs") pod "network-metrics-daemon-ckzlt" (UID: "1c4bcbdd-2490-4d47-b2b3-a2e832c63100") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.469210 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.469276 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 05:48:47 crc kubenswrapper[4869]: E0218 05:48:47.469348 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 18 05:48:47 crc kubenswrapper[4869]: E0218 05:48:47.469449 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.469470 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 05:48:47 crc kubenswrapper[4869]: E0218 05:48:47.469549 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.534242 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x784k\" (UniqueName: \"kubernetes.io/projected/afccc8b0-8dc9-44d8-80cc-33cd8ab4400e-kube-api-access-x784k\") pod \"ovnkube-control-plane-749d76644c-qspns\" (UID: \"afccc8b0-8dc9-44d8-80cc-33cd8ab4400e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qspns"
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.534338 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/afccc8b0-8dc9-44d8-80cc-33cd8ab4400e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qspns\" (UID: \"afccc8b0-8dc9-44d8-80cc-33cd8ab4400e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qspns"
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.534369 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/afccc8b0-8dc9-44d8-80cc-33cd8ab4400e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qspns\" (UID: \"afccc8b0-8dc9-44d8-80cc-33cd8ab4400e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qspns"
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.534398 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/afccc8b0-8dc9-44d8-80cc-33cd8ab4400e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qspns\" (UID: \"afccc8b0-8dc9-44d8-80cc-33cd8ab4400e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qspns"
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.535065 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/afccc8b0-8dc9-44d8-80cc-33cd8ab4400e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qspns\" (UID: \"afccc8b0-8dc9-44d8-80cc-33cd8ab4400e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qspns"
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.535257 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/afccc8b0-8dc9-44d8-80cc-33cd8ab4400e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qspns\" (UID: \"afccc8b0-8dc9-44d8-80cc-33cd8ab4400e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qspns"
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.539372 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/afccc8b0-8dc9-44d8-80cc-33cd8ab4400e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qspns\" (UID: \"afccc8b0-8dc9-44d8-80cc-33cd8ab4400e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qspns"
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.551894 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x784k\" (UniqueName: \"kubernetes.io/projected/afccc8b0-8dc9-44d8-80cc-33cd8ab4400e-kube-api-access-x784k\") pod \"ovnkube-control-plane-749d76644c-qspns\" (UID: \"afccc8b0-8dc9-44d8-80cc-33cd8ab4400e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qspns"
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.600549 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-47b4c" event={"ID":"296f0ea2-45d7-4cde-bd8f-09f01b56b82b","Type":"ContainerStarted","Data":"8abe6454939d0656a70665907358a891b56d0648ebb0b5a6be144954f62f2cbe"}
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.600617 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-47b4c" event={"ID":"296f0ea2-45d7-4cde-bd8f-09f01b56b82b","Type":"ContainerStarted","Data":"7092311c6377daffa866a136c6f3a8da0f4805fb37bfe5a0b1f17098696b9c42"}
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.602153 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rw6ns" event={"ID":"194aaf06-f563-4f03-aee6-c6e124706ef0","Type":"ContainerStarted","Data":"74818a805922b9ba9b6cb777d1dfa278740dedb9d767169ad0384a7015a07f54"}
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.602205 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rw6ns" event={"ID":"194aaf06-f563-4f03-aee6-c6e124706ef0","Type":"ContainerStarted","Data":"797066dee8d697ffd454c8848f638c22e31880955fdceec13f9f4d611f879760"}
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.604212 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2gzwj" event={"ID":"87706ebb-d517-4b38-a542-d0afd6c8c9c2","Type":"ContainerStarted","Data":"223dc0969547c24dfedc3291abfc2d364a072bd0ed0624e883a151464a9e90dc"}
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.604253 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2gzwj" event={"ID":"87706ebb-d517-4b38-a542-d0afd6c8c9c2","Type":"ContainerStarted","Data":"95b44c362be9f8dfbc7976dfd25b50f3a126e64a5081f8f7cb4ebd7cd77a7445"}
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.605792 4869 generic.go:334] "Generic (PLEG): container finished" podID="a30082d3-c125-4e76-8ead-3633b967d974" containerID="8854492f2a0b8e6232e63a3dc435950034b82d8b6738f0f98a70fc5ff5bfcf54" exitCode=0
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.605841 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" event={"ID":"a30082d3-c125-4e76-8ead-3633b967d974","Type":"ContainerDied","Data":"8854492f2a0b8e6232e63a3dc435950034b82d8b6738f0f98a70fc5ff5bfcf54"}
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.605902 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" event={"ID":"a30082d3-c125-4e76-8ead-3633b967d974","Type":"ContainerStarted","Data":"227b27eeb303280cfbe417c3902c92fda4c4946cf1a5d23602243928aadbedf0"}
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.607352 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5kq9x" event={"ID":"a6bdacde-403f-48ac-8366-547278b19432","Type":"ContainerStarted","Data":"dcde8c77341f16353ec6c46a6485a903fe26eb9c7886537f6a6cc15d8f936238"}
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.607416 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5kq9x" event={"ID":"a6bdacde-403f-48ac-8366-547278b19432","Type":"ContainerStarted","Data":"aad8ca6ba14f75bf9ade395e8a669512e3db3d780c27c1b8c4029ba86bf480c3"}
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.609863 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerStarted","Data":"9ce674b0f6e223b8f6b6b02923f38957d03fde023f27c5253b2e79c1bd016e07"}
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.609907 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerStarted","Data":"97067aaa66b615246c12637e475a0c048474fb0516f40d0cbe72ff5c54a9bc80"}
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.609929 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerStarted","Data":"73baa4f93add61f13e9acd88df5d49623b4bc98e3ba49725d86219189afa3d4b"}
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.638526 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qspns"
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.648147 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-47b4c" podStartSLOduration=1.6481117109999999 podStartE2EDuration="1.648111711s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:47.619527996 +0000 UTC m=+24.788616238" watchObservedRunningTime="2026-02-18 05:48:47.648111711 +0000 UTC m=+24.817199993"
Feb 18 05:48:47 crc kubenswrapper[4869]: W0218 05:48:47.651154 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafccc8b0_8dc9_44d8_80cc_33cd8ab4400e.slice/crio-74f92e6e7c1625e933059dc070d047050dba6600d1a6fbff55a7bc511d744a5a WatchSource:0}: Error finding container 74f92e6e7c1625e933059dc070d047050dba6600d1a6fbff55a7bc511d744a5a: Status 404 returned error can't find the container with id 74f92e6e7c1625e933059dc070d047050dba6600d1a6fbff55a7bc511d744a5a
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.671683 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-2gzwj" podStartSLOduration=1.671651854 podStartE2EDuration="1.671651854s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:47.67023044 +0000 UTC m=+24.839318682" watchObservedRunningTime="2026-02-18 05:48:47.671651854 +0000 UTC m=+24.840740116"
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.685303 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podStartSLOduration=1.685273176 podStartE2EDuration="1.685273176s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:47.685202384 +0000 UTC m=+24.854290636" watchObservedRunningTime="2026-02-18 05:48:47.685273176 +0000 UTC m=+24.854361418"
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.728141 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rw6ns" podStartSLOduration=1.728116348 podStartE2EDuration="1.728116348s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:47.726806536 +0000 UTC m=+24.895894768" watchObservedRunningTime="2026-02-18 05:48:47.728116348 +0000 UTC m=+24.897204570"
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.843072 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.847484 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 18 05:48:47 crc kubenswrapper[4869]: I0218 05:48:47.853960 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"]
Feb 18 05:48:48 crc kubenswrapper[4869]: I0218 05:48:48.424069 4869 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 09:34:26.080215637 +0000 UTC
Feb 18 05:48:48 crc kubenswrapper[4869]: I0218 05:48:48.445076 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c4bcbdd-2490-4d47-b2b3-a2e832c63100-metrics-certs\") pod \"network-metrics-daemon-ckzlt\" (UID: \"1c4bcbdd-2490-4d47-b2b3-a2e832c63100\") " pod="openshift-multus/network-metrics-daemon-ckzlt"
Feb 18 05:48:48 crc kubenswrapper[4869]: E0218 05:48:48.445299 4869 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 18 05:48:48 crc kubenswrapper[4869]: E0218 05:48:48.445423 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c4bcbdd-2490-4d47-b2b3-a2e832c63100-metrics-certs podName:1c4bcbdd-2490-4d47-b2b3-a2e832c63100 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:50.445395448 +0000 UTC m=+27.614483860 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c4bcbdd-2490-4d47-b2b3-a2e832c63100-metrics-certs") pod "network-metrics-daemon-ckzlt" (UID: "1c4bcbdd-2490-4d47-b2b3-a2e832c63100") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 18 05:48:48 crc kubenswrapper[4869]: I0218 05:48:48.470165 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckzlt"
Feb 18 05:48:48 crc kubenswrapper[4869]: E0218 05:48:48.470340 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckzlt" podUID="1c4bcbdd-2490-4d47-b2b3-a2e832c63100"
Feb 18 05:48:48 crc kubenswrapper[4869]: I0218 05:48:48.625111 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qspns" event={"ID":"afccc8b0-8dc9-44d8-80cc-33cd8ab4400e","Type":"ContainerStarted","Data":"8a5eb6c5fe9db5abe9574f712bc143b311171f5e296d8b3b09d01d436c572f45"}
Feb 18 05:48:48 crc kubenswrapper[4869]: I0218 05:48:48.625165 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qspns" event={"ID":"afccc8b0-8dc9-44d8-80cc-33cd8ab4400e","Type":"ContainerStarted","Data":"d70bfc59623d2ec7bfefdc6cf08ff63ddf588e73957b3c5170f203e5432a4a98"}
Feb 18 05:48:48 crc kubenswrapper[4869]: I0218 05:48:48.625176 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qspns" event={"ID":"afccc8b0-8dc9-44d8-80cc-33cd8ab4400e","Type":"ContainerStarted","Data":"74f92e6e7c1625e933059dc070d047050dba6600d1a6fbff55a7bc511d744a5a"}
Feb 18 05:48:48 crc kubenswrapper[4869]: I0218 05:48:48.628761 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" event={"ID":"a30082d3-c125-4e76-8ead-3633b967d974","Type":"ContainerStarted","Data":"41418d0c8201ad9058504d96c160609a322ed6366875006af5d63e23b22d5a2f"}
Feb 18 05:48:48 crc kubenswrapper[4869]: I0218 05:48:48.628862 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" event={"ID":"a30082d3-c125-4e76-8ead-3633b967d974","Type":"ContainerStarted","Data":"4ee8aadf6ad995457023ee0e612a1c97316952c5e23c95031d54c41f8eb46f92"}
Feb 18 05:48:48 crc kubenswrapper[4869]: I0218 05:48:48.628928 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" event={"ID":"a30082d3-c125-4e76-8ead-3633b967d974","Type":"ContainerStarted","Data":"4000b9032973107bf63573949eba7869ac1e56195a634cbe30d60dea8cdc92c9"}
Feb 18 05:48:48 crc kubenswrapper[4869]: I0218 05:48:48.628993 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" event={"ID":"a30082d3-c125-4e76-8ead-3633b967d974","Type":"ContainerStarted","Data":"b109af6f911eae9240a588e5fc6d09fac51c9d272033edc23d5dec25cfad746a"}
Feb 18 05:48:48 crc kubenswrapper[4869]: I0218 05:48:48.629046 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" event={"ID":"a30082d3-c125-4e76-8ead-3633b967d974","Type":"ContainerStarted","Data":"e7307f4b57ff0bdeb6c357526c1b4f36e8187d31776212a82f1cfa23b0967ad2"}
Feb 18 05:48:48 crc kubenswrapper[4869]: I0218 05:48:48.630681 4869 generic.go:334] "Generic (PLEG): container finished" podID="a6bdacde-403f-48ac-8366-547278b19432" containerID="dcde8c77341f16353ec6c46a6485a903fe26eb9c7886537f6a6cc15d8f936238" exitCode=0
Feb 18 05:48:48 crc kubenswrapper[4869]: I0218 05:48:48.630808 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5kq9x" event={"ID":"a6bdacde-403f-48ac-8366-547278b19432","Type":"ContainerDied","Data":"dcde8c77341f16353ec6c46a6485a903fe26eb9c7886537f6a6cc15d8f936238"}
Feb 18 05:48:48 crc kubenswrapper[4869]: E0218 05:48:48.637961 4869 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 18 05:48:48 crc kubenswrapper[4869]: I0218 05:48:48.674947 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qspns" podStartSLOduration=1.6749221749999998 podStartE2EDuration="1.674922175s" podCreationTimestamp="2026-02-18 05:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:48.674400473 +0000 UTC m=+25.843488715" watchObservedRunningTime="2026-02-18 05:48:48.674922175 +0000 UTC m=+25.844010407"
Feb 18 05:48:48 crc kubenswrapper[4869]: I0218 05:48:48.675570 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=1.675561592 podStartE2EDuration="1.675561592s" podCreationTimestamp="2026-02-18 05:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:48.658380194 +0000 UTC m=+25.827468416" watchObservedRunningTime="2026-02-18 05:48:48.675561592 +0000 UTC m=+25.844649844"
Feb 18 05:48:48 crc kubenswrapper[4869]: I0218 05:48:48.975449 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 18 05:48:48 crc kubenswrapper[4869]: I0218 05:48:48.978112 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 05:48:48 crc kubenswrapper[4869]: I0218 05:48:48.978163 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 05:48:48 crc kubenswrapper[4869]: I0218 05:48:48.978172 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 05:48:48 crc kubenswrapper[4869]: I0218 05:48:48.978308 4869 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 18 05:48:48 crc kubenswrapper[4869]: I0218 05:48:48.986364 4869 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Feb 18 05:48:48 crc kubenswrapper[4869]: I0218 05:48:48.986774 4869 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Feb 18 05:48:48 crc kubenswrapper[4869]: I0218 05:48:48.988249 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 05:48:48 crc kubenswrapper[4869]: I0218 05:48:48.988296 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 05:48:48 crc kubenswrapper[4869]: I0218 05:48:48.988309 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 05:48:48 crc kubenswrapper[4869]: I0218 05:48:48.988331 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 05:48:48 crc kubenswrapper[4869]: I0218 05:48:48.988343 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T05:48:48Z","lastTransitionTime":"2026-02-18T05:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.042799 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxmpq"]
Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.043228 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxmpq"
Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.045105 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.046979 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.047026 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.047691 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.154018 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wxmpq\" (UID: \"56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxmpq"
Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.154082 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wxmpq\" (UID: \"56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxmpq"
Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.154276 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wxmpq\" (UID: \"56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxmpq"
Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.154332 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wxmpq\" (UID: \"56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxmpq"
Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.154364 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wxmpq\" (UID: \"56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxmpq"
Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.255861 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wxmpq\" (UID: \"56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxmpq"
Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.255946 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wxmpq\" (UID: \"56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxmpq"
Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.256022 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wxmpq\" (UID: \"56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxmpq"
Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.256058 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wxmpq\" (UID: \"56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxmpq"
Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.256060 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wxmpq\" (UID: \"56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxmpq"
Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.256133 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wxmpq\" (UID: \"56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxmpq"
Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.256286 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wxmpq\" (UID: \"56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxmpq"
Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.258075 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wxmpq\" (UID: \"56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxmpq"
Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.265593 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wxmpq\" (UID: \"56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxmpq"
Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.274409 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wxmpq\" (UID: \"56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxmpq"
Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.355427 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxmpq"
Feb 18 05:48:49 crc kubenswrapper[4869]: W0218 05:48:49.386614 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56f83cd7_4d21_48d4_b6d0_ae64ae0c4f1d.slice/crio-791e86f8f65b4a81c185dfee66527ac577fe99b5e80c1f1816ea9d774b1ac7da WatchSource:0}: Error finding container 791e86f8f65b4a81c185dfee66527ac577fe99b5e80c1f1816ea9d774b1ac7da: Status 404 returned error can't find the container with id 791e86f8f65b4a81c185dfee66527ac577fe99b5e80c1f1816ea9d774b1ac7da
Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.425786 4869 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 22:43:29.485315799 +0000 UTC
Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.425917 4869 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.433284 4869 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.469554 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.469689 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 05:48:49 crc kubenswrapper[4869]: E0218 05:48:49.469696 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.469551 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:49 crc kubenswrapper[4869]: E0218 05:48:49.469835 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 05:48:49 crc kubenswrapper[4869]: E0218 05:48:49.469951 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.638566 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" event={"ID":"a30082d3-c125-4e76-8ead-3633b967d974","Type":"ContainerStarted","Data":"70862fff347446129454a43683a347883e36d81a5d909013b5314500b33312cd"} Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.640950 4869 generic.go:334] "Generic (PLEG): container finished" podID="a6bdacde-403f-48ac-8366-547278b19432" containerID="1af47582b991b811381fa4bd09b1ec937c2055aeb04e798e36cac289d8ddb34c" exitCode=0 Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.641000 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5kq9x" event={"ID":"a6bdacde-403f-48ac-8366-547278b19432","Type":"ContainerDied","Data":"1af47582b991b811381fa4bd09b1ec937c2055aeb04e798e36cac289d8ddb34c"} Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.644505 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxmpq" event={"ID":"56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d","Type":"ContainerStarted","Data":"b67a803341b02137617c0fc66c6fdcfab6f993de672e07d405f74bcc83ae8509"} Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.644589 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxmpq" event={"ID":"56f83cd7-4d21-48d4-b6d0-ae64ae0c4f1d","Type":"ContainerStarted","Data":"791e86f8f65b4a81c185dfee66527ac577fe99b5e80c1f1816ea9d774b1ac7da"} Feb 18 05:48:49 crc kubenswrapper[4869]: I0218 05:48:49.684767 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxmpq" podStartSLOduration=3.684709807 podStartE2EDuration="3.684709807s" 
podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:49.684641294 +0000 UTC m=+26.853729526" watchObservedRunningTime="2026-02-18 05:48:49.684709807 +0000 UTC m=+26.853798039" Feb 18 05:48:50 crc kubenswrapper[4869]: I0218 05:48:50.468428 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c4bcbdd-2490-4d47-b2b3-a2e832c63100-metrics-certs\") pod \"network-metrics-daemon-ckzlt\" (UID: \"1c4bcbdd-2490-4d47-b2b3-a2e832c63100\") " pod="openshift-multus/network-metrics-daemon-ckzlt" Feb 18 05:48:50 crc kubenswrapper[4869]: E0218 05:48:50.468623 4869 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 05:48:50 crc kubenswrapper[4869]: E0218 05:48:50.469034 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c4bcbdd-2490-4d47-b2b3-a2e832c63100-metrics-certs podName:1c4bcbdd-2490-4d47-b2b3-a2e832c63100 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:54.469014548 +0000 UTC m=+31.638102780 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c4bcbdd-2490-4d47-b2b3-a2e832c63100-metrics-certs") pod "network-metrics-daemon-ckzlt" (UID: "1c4bcbdd-2490-4d47-b2b3-a2e832c63100") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 05:48:50 crc kubenswrapper[4869]: I0218 05:48:50.469219 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckzlt" Feb 18 05:48:50 crc kubenswrapper[4869]: E0218 05:48:50.469351 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckzlt" podUID="1c4bcbdd-2490-4d47-b2b3-a2e832c63100" Feb 18 05:48:50 crc kubenswrapper[4869]: I0218 05:48:50.651514 4869 generic.go:334] "Generic (PLEG): container finished" podID="a6bdacde-403f-48ac-8366-547278b19432" containerID="8dfdb3991a5c69d439b22e62d47340dd18abe356078fe0dd23a769f80b9a88ad" exitCode=0 Feb 18 05:48:50 crc kubenswrapper[4869]: I0218 05:48:50.651581 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5kq9x" event={"ID":"a6bdacde-403f-48ac-8366-547278b19432","Type":"ContainerDied","Data":"8dfdb3991a5c69d439b22e62d47340dd18abe356078fe0dd23a769f80b9a88ad"} Feb 18 05:48:51 crc kubenswrapper[4869]: I0218 05:48:51.175808 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:51 crc kubenswrapper[4869]: E0218 05:48:51.175985 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:59.175948546 +0000 UTC m=+36.345036778 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:51 crc kubenswrapper[4869]: I0218 05:48:51.277309 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:51 crc kubenswrapper[4869]: I0218 05:48:51.277389 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:51 crc kubenswrapper[4869]: I0218 05:48:51.277430 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:51 crc kubenswrapper[4869]: I0218 05:48:51.277468 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:51 crc kubenswrapper[4869]: E0218 05:48:51.277617 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 05:48:51 crc kubenswrapper[4869]: E0218 05:48:51.277640 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 05:48:51 crc kubenswrapper[4869]: E0218 05:48:51.277655 4869 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:51 crc kubenswrapper[4869]: E0218 05:48:51.277716 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:59.277695822 +0000 UTC m=+36.446784074 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:51 crc kubenswrapper[4869]: E0218 05:48:51.277811 4869 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 05:48:51 crc kubenswrapper[4869]: E0218 05:48:51.277845 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:59.277835656 +0000 UTC m=+36.446923898 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 05:48:51 crc kubenswrapper[4869]: E0218 05:48:51.277903 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 05:48:51 crc kubenswrapper[4869]: E0218 05:48:51.278046 4869 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 05:48:51 crc kubenswrapper[4869]: E0218 05:48:51.278180 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:59.278150044 +0000 UTC m=+36.447238466 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 05:48:51 crc kubenswrapper[4869]: E0218 05:48:51.277916 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 05:48:51 crc kubenswrapper[4869]: E0218 05:48:51.278361 4869 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:51 crc kubenswrapper[4869]: E0218 05:48:51.278432 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:59.27841436 +0000 UTC m=+36.447502812 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:51 crc kubenswrapper[4869]: I0218 05:48:51.469907 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:51 crc kubenswrapper[4869]: I0218 05:48:51.469939 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:51 crc kubenswrapper[4869]: I0218 05:48:51.470121 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:51 crc kubenswrapper[4869]: E0218 05:48:51.470124 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 05:48:51 crc kubenswrapper[4869]: E0218 05:48:51.470271 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 05:48:51 crc kubenswrapper[4869]: E0218 05:48:51.470481 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 05:48:51 crc kubenswrapper[4869]: I0218 05:48:51.657733 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" event={"ID":"a30082d3-c125-4e76-8ead-3633b967d974","Type":"ContainerStarted","Data":"0837ffbf5f6a9c3305554c6fb64fe0352d25511750c6d4e61082f168d3b9a7e6"} Feb 18 05:48:51 crc kubenswrapper[4869]: I0218 05:48:51.660077 4869 generic.go:334] "Generic (PLEG): container finished" podID="a6bdacde-403f-48ac-8366-547278b19432" containerID="550096c79d229a2c3c542bee13ec487d28726d0af9f7832143c95e059793ca8f" exitCode=0 Feb 18 05:48:51 crc kubenswrapper[4869]: I0218 05:48:51.660113 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5kq9x" event={"ID":"a6bdacde-403f-48ac-8366-547278b19432","Type":"ContainerDied","Data":"550096c79d229a2c3c542bee13ec487d28726d0af9f7832143c95e059793ca8f"} Feb 18 05:48:52 crc kubenswrapper[4869]: I0218 05:48:52.470081 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckzlt" Feb 18 05:48:52 crc kubenswrapper[4869]: E0218 05:48:52.470261 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ckzlt" podUID="1c4bcbdd-2490-4d47-b2b3-a2e832c63100" Feb 18 05:48:52 crc kubenswrapper[4869]: I0218 05:48:52.671894 4869 generic.go:334] "Generic (PLEG): container finished" podID="a6bdacde-403f-48ac-8366-547278b19432" containerID="b5a946ae8cea1599047be94402cb42ef5d2ff738145d0efe6563ba39b9615b00" exitCode=0 Feb 18 05:48:52 crc kubenswrapper[4869]: I0218 05:48:52.671954 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5kq9x" event={"ID":"a6bdacde-403f-48ac-8366-547278b19432","Type":"ContainerDied","Data":"b5a946ae8cea1599047be94402cb42ef5d2ff738145d0efe6563ba39b9615b00"} Feb 18 05:48:53 crc kubenswrapper[4869]: I0218 05:48:53.254670 4869 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 18 05:48:53 crc kubenswrapper[4869]: I0218 05:48:53.469987 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:53 crc kubenswrapper[4869]: I0218 05:48:53.470150 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:53 crc kubenswrapper[4869]: I0218 05:48:53.470207 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:53 crc kubenswrapper[4869]: E0218 05:48:53.471013 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 05:48:53 crc kubenswrapper[4869]: E0218 05:48:53.471127 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 05:48:53 crc kubenswrapper[4869]: E0218 05:48:53.471266 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 05:48:53 crc kubenswrapper[4869]: I0218 05:48:53.681872 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" event={"ID":"a30082d3-c125-4e76-8ead-3633b967d974","Type":"ContainerStarted","Data":"fdf57f8214c9edcc60b7f0fe48fd3e36de28209082e5713a5f43dc740db378ee"} Feb 18 05:48:53 crc kubenswrapper[4869]: I0218 05:48:53.682382 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:53 crc kubenswrapper[4869]: I0218 05:48:53.687327 4869 generic.go:334] "Generic (PLEG): container finished" podID="a6bdacde-403f-48ac-8366-547278b19432" containerID="f80ce3e94cf81c8e7752f33c4d27e414ca7f428be0b509841a7657e504ffe513" exitCode=0 Feb 18 05:48:53 crc kubenswrapper[4869]: I0218 05:48:53.687382 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-5kq9x" event={"ID":"a6bdacde-403f-48ac-8366-547278b19432","Type":"ContainerDied","Data":"f80ce3e94cf81c8e7752f33c4d27e414ca7f428be0b509841a7657e504ffe513"} Feb 18 05:48:53 crc kubenswrapper[4869]: I0218 05:48:53.714065 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:53 crc kubenswrapper[4869]: I0218 05:48:53.751616 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" podStartSLOduration=7.751588342 podStartE2EDuration="7.751588342s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:53.724336919 +0000 UTC m=+30.893425181" watchObservedRunningTime="2026-02-18 05:48:53.751588342 +0000 UTC m=+30.920676574" Feb 18 05:48:54 crc kubenswrapper[4869]: I0218 05:48:54.469978 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckzlt" Feb 18 05:48:54 crc kubenswrapper[4869]: E0218 05:48:54.470137 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ckzlt" podUID="1c4bcbdd-2490-4d47-b2b3-a2e832c63100" Feb 18 05:48:54 crc kubenswrapper[4869]: I0218 05:48:54.515999 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c4bcbdd-2490-4d47-b2b3-a2e832c63100-metrics-certs\") pod \"network-metrics-daemon-ckzlt\" (UID: \"1c4bcbdd-2490-4d47-b2b3-a2e832c63100\") " pod="openshift-multus/network-metrics-daemon-ckzlt" Feb 18 05:48:54 crc kubenswrapper[4869]: E0218 05:48:54.516141 4869 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 05:48:54 crc kubenswrapper[4869]: E0218 05:48:54.516207 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c4bcbdd-2490-4d47-b2b3-a2e832c63100-metrics-certs podName:1c4bcbdd-2490-4d47-b2b3-a2e832c63100 nodeName:}" failed. No retries permitted until 2026-02-18 05:49:02.516187834 +0000 UTC m=+39.685276066 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c4bcbdd-2490-4d47-b2b3-a2e832c63100-metrics-certs") pod "network-metrics-daemon-ckzlt" (UID: "1c4bcbdd-2490-4d47-b2b3-a2e832c63100") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 05:48:54 crc kubenswrapper[4869]: I0218 05:48:54.695180 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5kq9x" event={"ID":"a6bdacde-403f-48ac-8366-547278b19432","Type":"ContainerStarted","Data":"470d11c5bdabc262c88686063d67c45e2bb79b61814d59b5c0cb1a1bd5049172"} Feb 18 05:48:54 crc kubenswrapper[4869]: I0218 05:48:54.695259 4869 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 05:48:54 crc kubenswrapper[4869]: I0218 05:48:54.695727 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:54 crc kubenswrapper[4869]: I0218 05:48:54.723164 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5kq9x" podStartSLOduration=8.723146122 podStartE2EDuration="8.723146122s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:54.722341602 +0000 UTC m=+31.891429914" watchObservedRunningTime="2026-02-18 05:48:54.723146122 +0000 UTC m=+31.892234354" Feb 18 05:48:54 crc kubenswrapper[4869]: I0218 05:48:54.727101 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:48:54 crc kubenswrapper[4869]: I0218 05:48:54.766382 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:48:55 crc kubenswrapper[4869]: I0218 05:48:55.325099 4869 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ckzlt"] Feb 18 05:48:55 crc kubenswrapper[4869]: I0218 05:48:55.325240 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckzlt" Feb 18 05:48:55 crc kubenswrapper[4869]: E0218 05:48:55.325340 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckzlt" podUID="1c4bcbdd-2490-4d47-b2b3-a2e832c63100" Feb 18 05:48:55 crc kubenswrapper[4869]: I0218 05:48:55.469499 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:55 crc kubenswrapper[4869]: I0218 05:48:55.469519 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:55 crc kubenswrapper[4869]: I0218 05:48:55.469590 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:55 crc kubenswrapper[4869]: E0218 05:48:55.469653 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 05:48:55 crc kubenswrapper[4869]: E0218 05:48:55.469835 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 05:48:55 crc kubenswrapper[4869]: E0218 05:48:55.469913 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 05:48:55 crc kubenswrapper[4869]: I0218 05:48:55.698230 4869 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 05:48:56 crc kubenswrapper[4869]: I0218 05:48:56.701932 4869 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 05:48:57 crc kubenswrapper[4869]: I0218 05:48:57.469372 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:57 crc kubenswrapper[4869]: I0218 05:48:57.469556 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckzlt" Feb 18 05:48:57 crc kubenswrapper[4869]: I0218 05:48:57.469640 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:57 crc kubenswrapper[4869]: I0218 05:48:57.469677 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:57 crc kubenswrapper[4869]: E0218 05:48:57.469683 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 05:48:57 crc kubenswrapper[4869]: E0218 05:48:57.469737 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 05:48:57 crc kubenswrapper[4869]: E0218 05:48:57.469987 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ckzlt" podUID="1c4bcbdd-2490-4d47-b2b3-a2e832c63100" Feb 18 05:48:57 crc kubenswrapper[4869]: E0218 05:48:57.470074 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.409680 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.410012 4869 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.464927 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lljlj"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.465810 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lljlj" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.467907 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9m6gl"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.468627 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.469157 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.469369 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.469677 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.471384 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.472152 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bft8b"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.473448 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.473539 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.474085 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.474458 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.474822 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.475218 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.477653 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c5v7h"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.478587 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c5v7h" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.481783 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fbl5d"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.482807 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fbl5d" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.484697 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.485584 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.485958 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.486834 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.487247 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.489011 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.489270 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.489498 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.491547 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wgv88"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.492155 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-2gwhz"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 
05:48:58.492684 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-2gwhz" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.493295 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wgv88" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.498551 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.498947 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.499104 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.499283 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.499486 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.499764 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.499815 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.499939 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.500010 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 18 05:48:58 
crc kubenswrapper[4869]: I0218 05:48:58.500065 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.500230 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.500403 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.501048 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.503148 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.503396 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.503551 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.503667 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.503893 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.504003 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.504113 4869 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.504227 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.504334 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.504967 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tthlh"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.505591 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.509898 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-bdj4n"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.510480 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-tc859"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.510809 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-tc859" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.512079 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bdj4n" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.532033 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.534087 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.538078 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.551509 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.552648 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.552673 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-mx9zm"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.552827 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.552973 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.553159 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.553210 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-mx9zm" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.553771 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.554171 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.554687 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.554703 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.554926 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.554930 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.555295 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.562235 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.565640 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567001 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/33549d64-6034-4f60-b254-2729e899e541-audit-dir\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567062 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29h4t\" (UniqueName: \"kubernetes.io/projected/7411fb2b-cd62-452d-a5c8-94135752329d-kube-api-access-29h4t\") pod \"machine-api-operator-5694c8668f-lljlj\" (UID: \"7411fb2b-cd62-452d-a5c8-94135752329d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lljlj" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567096 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567120 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33549d64-6034-4f60-b254-2729e899e541-serving-cert\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567145 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e1a4023-11b5-47b3-886a-c37ecabc0103-config\") pod \"machine-approver-56656f9798-bdj4n\" (UID: \"2e1a4023-11b5-47b3-886a-c37ecabc0103\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bdj4n" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567169 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a9ba96f-26e6-4870-9e59-9735f210eef3-client-ca\") pod \"controller-manager-879f6c89f-9m6gl\" (UID: \"5a9ba96f-26e6-4870-9e59-9735f210eef3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567191 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c53da5b2-9e42-4160-8ed1-2600d9e76880-client-ca\") pod \"route-controller-manager-6576b87f9c-b7522\" (UID: \"c53da5b2-9e42-4160-8ed1-2600d9e76880\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567217 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tn57\" (UniqueName: \"kubernetes.io/projected/f43ce73f-2618-4c47-9304-7db1e67db22f-kube-api-access-7tn57\") pod \"openshift-config-operator-7777fb866f-fbl5d\" (UID: \"f43ce73f-2618-4c47-9304-7db1e67db22f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fbl5d" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567243 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d72ef0c5-fb30-4d98-9237-a992acf49959-audit-dir\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567266 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8d17264-7c40-4eb6-82eb-f4020e635dde-serving-cert\") pod \"apiserver-7bbb656c7d-stbsw\" (UID: \"b8d17264-7c40-4eb6-82eb-f4020e635dde\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567287 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccv8w\" (UniqueName: \"kubernetes.io/projected/8f7d294e-1e88-4265-8ad2-3344b7148caa-kube-api-access-ccv8w\") pod \"openshift-apiserver-operator-796bbdcf4f-c5v7h\" (UID: \"8f7d294e-1e88-4265-8ad2-3344b7148caa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c5v7h" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567310 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c53da5b2-9e42-4160-8ed1-2600d9e76880-config\") pod \"route-controller-manager-6576b87f9c-b7522\" (UID: \"c53da5b2-9e42-4160-8ed1-2600d9e76880\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567334 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbkch\" (UniqueName: \"kubernetes.io/projected/c53da5b2-9e42-4160-8ed1-2600d9e76880-kube-api-access-wbkch\") pod \"route-controller-manager-6576b87f9c-b7522\" (UID: \"c53da5b2-9e42-4160-8ed1-2600d9e76880\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567355 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-trusted-ca-bundle\") pod \"console-f9d7485db-tc859\" (UID: 
\"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " pod="openshift-console/console-f9d7485db-tc859" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567374 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdgn7\" (UniqueName: \"kubernetes.io/projected/d72ef0c5-fb30-4d98-9237-a992acf49959-kube-api-access-qdgn7\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567396 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a9ba96f-26e6-4870-9e59-9735f210eef3-config\") pod \"controller-manager-879f6c89f-9m6gl\" (UID: \"5a9ba96f-26e6-4870-9e59-9735f210eef3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567418 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a9ba96f-26e6-4870-9e59-9735f210eef3-serving-cert\") pod \"controller-manager-879f6c89f-9m6gl\" (UID: \"5a9ba96f-26e6-4870-9e59-9735f210eef3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567439 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567460 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-8dgzp\" (UniqueName: \"kubernetes.io/projected/2e1a4023-11b5-47b3-886a-c37ecabc0103-kube-api-access-8dgzp\") pod \"machine-approver-56656f9798-bdj4n\" (UID: \"2e1a4023-11b5-47b3-886a-c37ecabc0103\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bdj4n" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567483 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f7d294e-1e88-4265-8ad2-3344b7148caa-config\") pod \"openshift-apiserver-operator-796bbdcf4f-c5v7h\" (UID: \"8f7d294e-1e88-4265-8ad2-3344b7148caa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c5v7h" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567506 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzk27\" (UniqueName: \"kubernetes.io/projected/f7ad651f-69ed-4f4f-9792-6f6a1e635edc-kube-api-access-fzk27\") pod \"authentication-operator-69f744f599-2gwhz\" (UID: \"f7ad651f-69ed-4f4f-9792-6f6a1e635edc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2gwhz" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567529 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b8d17264-7c40-4eb6-82eb-f4020e635dde-encryption-config\") pod \"apiserver-7bbb656c7d-stbsw\" (UID: \"b8d17264-7c40-4eb6-82eb-f4020e635dde\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567538 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567550 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" 
(UniqueName: \"kubernetes.io/configmap/33549d64-6034-4f60-b254-2729e899e541-audit\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567575 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtnpg\" (UniqueName: \"kubernetes.io/projected/d13ea6d8-477e-4add-b9dc-f8cac9eb0b01-kube-api-access-gtnpg\") pod \"downloads-7954f5f757-mx9zm\" (UID: \"d13ea6d8-477e-4add-b9dc-f8cac9eb0b01\") " pod="openshift-console/downloads-7954f5f757-mx9zm" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567647 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-oauth-serving-cert\") pod \"console-f9d7485db-tc859\" (UID: \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " pod="openshift-console/console-f9d7485db-tc859" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567687 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b8d17264-7c40-4eb6-82eb-f4020e635dde-audit-policies\") pod \"apiserver-7bbb656c7d-stbsw\" (UID: \"b8d17264-7c40-4eb6-82eb-f4020e635dde\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567712 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7411fb2b-cd62-452d-a5c8-94135752329d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lljlj\" (UID: \"7411fb2b-cd62-452d-a5c8-94135752329d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lljlj" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 
05:48:58.567735 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d72ef0c5-fb30-4d98-9237-a992acf49959-audit-policies\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567781 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2e1a4023-11b5-47b3-886a-c37ecabc0103-machine-approver-tls\") pod \"machine-approver-56656f9798-bdj4n\" (UID: \"2e1a4023-11b5-47b3-886a-c37ecabc0103\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bdj4n"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567810 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7d7w\" (UniqueName: \"kubernetes.io/projected/41f2c352-c537-4c3b-b206-ef73eba45593-kube-api-access-d7d7w\") pod \"cluster-image-registry-operator-dc59b4c8b-wgv88\" (UID: \"41f2c352-c537-4c3b-b206-ef73eba45593\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wgv88"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567831 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5a9ba96f-26e6-4870-9e59-9735f210eef3-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9m6gl\" (UID: \"5a9ba96f-26e6-4870-9e59-9735f210eef3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567853 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567890 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f7d294e-1e88-4265-8ad2-3344b7148caa-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-c5v7h\" (UID: \"8f7d294e-1e88-4265-8ad2-3344b7148caa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c5v7h"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567912 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567936 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567957 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b8d17264-7c40-4eb6-82eb-f4020e635dde-etcd-client\") pod \"apiserver-7bbb656c7d-stbsw\" (UID: \"b8d17264-7c40-4eb6-82eb-f4020e635dde\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.567983 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/33549d64-6034-4f60-b254-2729e899e541-node-pullsecrets\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568020 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568041 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b8d17264-7c40-4eb6-82eb-f4020e635dde-audit-dir\") pod \"apiserver-7bbb656c7d-stbsw\" (UID: \"b8d17264-7c40-4eb6-82eb-f4020e635dde\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568065 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27fsf\" (UniqueName: \"kubernetes.io/projected/b8d17264-7c40-4eb6-82eb-f4020e635dde-kube-api-access-27fsf\") pod \"apiserver-7bbb656c7d-stbsw\" (UID: \"b8d17264-7c40-4eb6-82eb-f4020e635dde\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568088 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7411fb2b-cd62-452d-a5c8-94135752329d-config\") pod \"machine-api-operator-5694c8668f-lljlj\" (UID: \"7411fb2b-cd62-452d-a5c8-94135752329d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lljlj"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568111 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-console-config\") pod \"console-f9d7485db-tc859\" (UID: \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " pod="openshift-console/console-f9d7485db-tc859"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568135 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f43ce73f-2618-4c47-9304-7db1e67db22f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fbl5d\" (UID: \"f43ce73f-2618-4c47-9304-7db1e67db22f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fbl5d"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568155 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/33549d64-6034-4f60-b254-2729e899e541-image-import-ca\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568179 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/41f2c352-c537-4c3b-b206-ef73eba45593-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wgv88\" (UID: \"41f2c352-c537-4c3b-b206-ef73eba45593\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wgv88"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568205 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8d17264-7c40-4eb6-82eb-f4020e635dde-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-stbsw\" (UID: \"b8d17264-7c40-4eb6-82eb-f4020e635dde\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568224 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/33549d64-6034-4f60-b254-2729e899e541-etcd-client\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568248 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b8d17264-7c40-4eb6-82eb-f4020e635dde-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-stbsw\" (UID: \"b8d17264-7c40-4eb6-82eb-f4020e635dde\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568274 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2e1a4023-11b5-47b3-886a-c37ecabc0103-auth-proxy-config\") pod \"machine-approver-56656f9798-bdj4n\" (UID: \"2e1a4023-11b5-47b3-886a-c37ecabc0103\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bdj4n"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568296 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c53da5b2-9e42-4160-8ed1-2600d9e76880-serving-cert\") pod \"route-controller-manager-6576b87f9c-b7522\" (UID: \"c53da5b2-9e42-4160-8ed1-2600d9e76880\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568325 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/33549d64-6034-4f60-b254-2729e899e541-etcd-serving-ca\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568345 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33549d64-6034-4f60-b254-2729e899e541-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568366 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-console-oauth-config\") pod \"console-f9d7485db-tc859\" (UID: \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " pod="openshift-console/console-f9d7485db-tc859"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568386 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqdv5\" (UniqueName: \"kubernetes.io/projected/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-kube-api-access-dqdv5\") pod \"console-f9d7485db-tc859\" (UID: \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " pod="openshift-console/console-f9d7485db-tc859"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568408 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cknpx\" (UniqueName: \"kubernetes.io/projected/33549d64-6034-4f60-b254-2729e899e541-kube-api-access-cknpx\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568428 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7ad651f-69ed-4f4f-9792-6f6a1e635edc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-2gwhz\" (UID: \"f7ad651f-69ed-4f4f-9792-6f6a1e635edc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2gwhz"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568450 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-service-ca\") pod \"console-f9d7485db-tc859\" (UID: \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " pod="openshift-console/console-f9d7485db-tc859"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568484 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7ad651f-69ed-4f4f-9792-6f6a1e635edc-config\") pod \"authentication-operator-69f744f599-2gwhz\" (UID: \"f7ad651f-69ed-4f4f-9792-6f6a1e635edc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2gwhz"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568515 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568540 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568564 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7ad651f-69ed-4f4f-9792-6f6a1e635edc-service-ca-bundle\") pod \"authentication-operator-69f744f599-2gwhz\" (UID: \"f7ad651f-69ed-4f4f-9792-6f6a1e635edc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2gwhz"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568587 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/41f2c352-c537-4c3b-b206-ef73eba45593-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wgv88\" (UID: \"41f2c352-c537-4c3b-b206-ef73eba45593\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wgv88"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568609 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/41f2c352-c537-4c3b-b206-ef73eba45593-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wgv88\" (UID: \"41f2c352-c537-4c3b-b206-ef73eba45593\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wgv88"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568632 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568655 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33549d64-6034-4f60-b254-2729e899e541-config\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568675 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7411fb2b-cd62-452d-a5c8-94135752329d-images\") pod \"machine-api-operator-5694c8668f-lljlj\" (UID: \"7411fb2b-cd62-452d-a5c8-94135752329d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lljlj"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568697 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568721 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f43ce73f-2618-4c47-9304-7db1e67db22f-serving-cert\") pod \"openshift-config-operator-7777fb866f-fbl5d\" (UID: \"f43ce73f-2618-4c47-9304-7db1e67db22f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fbl5d"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568789 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w5hr\" (UniqueName: \"kubernetes.io/projected/5a9ba96f-26e6-4870-9e59-9735f210eef3-kube-api-access-8w5hr\") pod \"controller-manager-879f6c89f-9m6gl\" (UID: \"5a9ba96f-26e6-4870-9e59-9735f210eef3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568813 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-console-serving-cert\") pod \"console-f9d7485db-tc859\" (UID: \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " pod="openshift-console/console-f9d7485db-tc859"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568846 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7ad651f-69ed-4f4f-9792-6f6a1e635edc-serving-cert\") pod \"authentication-operator-69f744f599-2gwhz\" (UID: \"f7ad651f-69ed-4f4f-9792-6f6a1e635edc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2gwhz"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568867 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/33549d64-6034-4f60-b254-2729e899e541-encryption-config\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.568894 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.569581 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzw2f"]
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.569692 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.569777 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.570231 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p8b4x"]
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.570894 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p8b4x"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.571017 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzw2f"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.571827 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mqfwk"]
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.572647 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-mqfwk"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.573943 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.574186 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.574412 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.577087 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.577478 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.577769 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.577919 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-twbsv"]
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.578825 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-twbsv"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.577950 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.579324 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.579506 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.580177 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.580303 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.580424 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.583820 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-pg7gk"]
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.584523 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-pg7gk"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.584724 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fhpv8"]
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.585116 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fhpv8"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.588381 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.588492 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.588699 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.588922 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.589046 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.589052 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.591813 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.591914 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.593081 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.593288 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.593386 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.593481 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.593571 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.593649 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.593760 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.593866 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.624670 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.629647 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.650144 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.650419 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.652044 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.652184 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.652261 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.652295 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.652650 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.652704 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.652950 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.653082 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.653107 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.653238 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.653362 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.653460 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.653616 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.655146 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.655972 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.656481 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.658129 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zkqc9"]
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.658652 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-2mz86"]
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.659056 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2hlr"]
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.659375 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vpxmk"]
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.659779 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vpxmk"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.660023 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zkqc9"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.660198 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2mz86"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.660203 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2hvk6"]
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.660381 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2hlr"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.662113 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.662812 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.663828 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8qkfs"]
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.664316 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7qwwp"]
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.664646 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7qwwp"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.664669 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-cnzqw"]
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.664856 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2hvk6"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.665100 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qkfs"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.665636 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-cnzqw"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.667876 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zmmhg"]
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.668766 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zmmhg"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.668926 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85kpf"]
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.669283 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85kpf"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.672156 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zd46x"]
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.672705 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6rqpw"]
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.677373 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.677614 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.678886 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.678922 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh"
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.678956 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c2a55f3d-af8c-4589-a9b0-22b324a77524-bound-sa-token\") pod 
\"ingress-operator-5b745b69d9-twbsv\" (UID: \"c2a55f3d-af8c-4589-a9b0-22b324a77524\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-twbsv" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.678990 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f7d294e-1e88-4265-8ad2-3344b7148caa-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-c5v7h\" (UID: \"8f7d294e-1e88-4265-8ad2-3344b7148caa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c5v7h" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679011 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679035 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b8d17264-7c40-4eb6-82eb-f4020e635dde-etcd-client\") pod \"apiserver-7bbb656c7d-stbsw\" (UID: \"b8d17264-7c40-4eb6-82eb-f4020e635dde\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679061 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/33549d64-6034-4f60-b254-2729e899e541-node-pullsecrets\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679082 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27fsf\" 
(UniqueName: \"kubernetes.io/projected/b8d17264-7c40-4eb6-82eb-f4020e635dde-kube-api-access-27fsf\") pod \"apiserver-7bbb656c7d-stbsw\" (UID: \"b8d17264-7c40-4eb6-82eb-f4020e635dde\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679114 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679134 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b8d17264-7c40-4eb6-82eb-f4020e635dde-audit-dir\") pod \"apiserver-7bbb656c7d-stbsw\" (UID: \"b8d17264-7c40-4eb6-82eb-f4020e635dde\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679150 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7411fb2b-cd62-452d-a5c8-94135752329d-config\") pod \"machine-api-operator-5694c8668f-lljlj\" (UID: \"7411fb2b-cd62-452d-a5c8-94135752329d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lljlj" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679167 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-console-config\") pod \"console-f9d7485db-tc859\" (UID: \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " pod="openshift-console/console-f9d7485db-tc859" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679186 4869 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f43ce73f-2618-4c47-9304-7db1e67db22f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fbl5d\" (UID: \"f43ce73f-2618-4c47-9304-7db1e67db22f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fbl5d" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679208 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/33549d64-6034-4f60-b254-2729e899e541-image-import-ca\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679234 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/81e0ea07-506f-4560-90de-b5ae7675113f-etcd-client\") pod \"etcd-operator-b45778765-fhpv8\" (UID: \"81e0ea07-506f-4560-90de-b5ae7675113f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fhpv8" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679254 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/41f2c352-c537-4c3b-b206-ef73eba45593-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wgv88\" (UID: \"41f2c352-c537-4c3b-b206-ef73eba45593\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wgv88" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679270 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/33549d64-6034-4f60-b254-2729e899e541-etcd-client\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 
05:48:58.679287 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8d17264-7c40-4eb6-82eb-f4020e635dde-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-stbsw\" (UID: \"b8d17264-7c40-4eb6-82eb-f4020e635dde\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679306 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b8d17264-7c40-4eb6-82eb-f4020e635dde-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-stbsw\" (UID: \"b8d17264-7c40-4eb6-82eb-f4020e635dde\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679323 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2e1a4023-11b5-47b3-886a-c37ecabc0103-auth-proxy-config\") pod \"machine-approver-56656f9798-bdj4n\" (UID: \"2e1a4023-11b5-47b3-886a-c37ecabc0103\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bdj4n" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679352 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c53da5b2-9e42-4160-8ed1-2600d9e76880-serving-cert\") pod \"route-controller-manager-6576b87f9c-b7522\" (UID: \"c53da5b2-9e42-4160-8ed1-2600d9e76880\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679370 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqdv5\" (UniqueName: \"kubernetes.io/projected/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-kube-api-access-dqdv5\") pod \"console-f9d7485db-tc859\" (UID: \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " 
pod="openshift-console/console-f9d7485db-tc859" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679403 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/33549d64-6034-4f60-b254-2729e899e541-etcd-serving-ca\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679429 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33549d64-6034-4f60-b254-2729e899e541-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679449 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-console-oauth-config\") pod \"console-f9d7485db-tc859\" (UID: \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " pod="openshift-console/console-f9d7485db-tc859" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679469 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cknpx\" (UniqueName: \"kubernetes.io/projected/33549d64-6034-4f60-b254-2729e899e541-kube-api-access-cknpx\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679488 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tvln\" (UniqueName: \"kubernetes.io/projected/f39f9dc9-3127-4302-93dd-faf8bb814b58-kube-api-access-8tvln\") pod \"console-operator-58897d9998-mqfwk\" (UID: 
\"f39f9dc9-3127-4302-93dd-faf8bb814b58\") " pod="openshift-console-operator/console-operator-58897d9998-mqfwk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679511 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7ad651f-69ed-4f4f-9792-6f6a1e635edc-config\") pod \"authentication-operator-69f744f599-2gwhz\" (UID: \"f7ad651f-69ed-4f4f-9792-6f6a1e635edc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2gwhz" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679534 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7ad651f-69ed-4f4f-9792-6f6a1e635edc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-2gwhz\" (UID: \"f7ad651f-69ed-4f4f-9792-6f6a1e635edc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2gwhz" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679551 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-service-ca\") pod \"console-f9d7485db-tc859\" (UID: \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " pod="openshift-console/console-f9d7485db-tc859" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679587 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679610 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679633 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7ad651f-69ed-4f4f-9792-6f6a1e635edc-service-ca-bundle\") pod \"authentication-operator-69f744f599-2gwhz\" (UID: \"f7ad651f-69ed-4f4f-9792-6f6a1e635edc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2gwhz" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679653 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33549d64-6034-4f60-b254-2729e899e541-config\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679671 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/81e0ea07-506f-4560-90de-b5ae7675113f-etcd-service-ca\") pod \"etcd-operator-b45778765-fhpv8\" (UID: \"81e0ea07-506f-4560-90de-b5ae7675113f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fhpv8" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679701 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/41f2c352-c537-4c3b-b206-ef73eba45593-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wgv88\" (UID: \"41f2c352-c537-4c3b-b206-ef73eba45593\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wgv88" Feb 18 05:48:58 crc 
kubenswrapper[4869]: I0218 05:48:58.679729 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/41f2c352-c537-4c3b-b206-ef73eba45593-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wgv88\" (UID: \"41f2c352-c537-4c3b-b206-ef73eba45593\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wgv88" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679781 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679808 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7411fb2b-cd62-452d-a5c8-94135752329d-images\") pod \"machine-api-operator-5694c8668f-lljlj\" (UID: \"7411fb2b-cd62-452d-a5c8-94135752329d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lljlj" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679833 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679854 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f43ce73f-2618-4c47-9304-7db1e67db22f-serving-cert\") pod \"openshift-config-operator-7777fb866f-fbl5d\" 
(UID: \"f43ce73f-2618-4c47-9304-7db1e67db22f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fbl5d" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679872 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/81e0ea07-506f-4560-90de-b5ae7675113f-etcd-ca\") pod \"etcd-operator-b45778765-fhpv8\" (UID: \"81e0ea07-506f-4560-90de-b5ae7675113f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fhpv8" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679892 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f96a9f3e-b511-4f78-83fd-a134aa3d5106-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-p8b4x\" (UID: \"f96a9f3e-b511-4f78-83fd-a134aa3d5106\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p8b4x" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679909 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f39f9dc9-3127-4302-93dd-faf8bb814b58-config\") pod \"console-operator-58897d9998-mqfwk\" (UID: \"f39f9dc9-3127-4302-93dd-faf8bb814b58\") " pod="openshift-console-operator/console-operator-58897d9998-mqfwk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679927 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w5hr\" (UniqueName: \"kubernetes.io/projected/5a9ba96f-26e6-4870-9e59-9735f210eef3-kube-api-access-8w5hr\") pod \"controller-manager-879f6c89f-9m6gl\" (UID: \"5a9ba96f-26e6-4870-9e59-9735f210eef3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679944 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-console-serving-cert\") pod \"console-f9d7485db-tc859\" (UID: \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " pod="openshift-console/console-f9d7485db-tc859" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679962 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/33549d64-6034-4f60-b254-2729e899e541-encryption-config\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.679986 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7ad651f-69ed-4f4f-9792-6f6a1e635edc-serving-cert\") pod \"authentication-operator-69f744f599-2gwhz\" (UID: \"f7ad651f-69ed-4f4f-9792-6f6a1e635edc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2gwhz" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680012 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tz77\" (UniqueName: \"kubernetes.io/projected/81e0ea07-506f-4560-90de-b5ae7675113f-kube-api-access-8tz77\") pod \"etcd-operator-b45778765-fhpv8\" (UID: \"81e0ea07-506f-4560-90de-b5ae7675113f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fhpv8" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680030 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: 
I0218 05:48:58.680048 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2a55f3d-af8c-4589-a9b0-22b324a77524-trusted-ca\") pod \"ingress-operator-5b745b69d9-twbsv\" (UID: \"c2a55f3d-af8c-4589-a9b0-22b324a77524\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-twbsv" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680066 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81e0ea07-506f-4560-90de-b5ae7675113f-serving-cert\") pod \"etcd-operator-b45778765-fhpv8\" (UID: \"81e0ea07-506f-4560-90de-b5ae7675113f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fhpv8" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680083 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2a55f3d-af8c-4589-a9b0-22b324a77524-metrics-tls\") pod \"ingress-operator-5b745b69d9-twbsv\" (UID: \"c2a55f3d-af8c-4589-a9b0-22b324a77524\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-twbsv" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680101 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81e0ea07-506f-4560-90de-b5ae7675113f-config\") pod \"etcd-operator-b45778765-fhpv8\" (UID: \"81e0ea07-506f-4560-90de-b5ae7675113f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fhpv8" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680121 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71f66291-128a-4347-9c2b-e8a1f25aea67-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wzw2f\" (UID: 
\"71f66291-128a-4347-9c2b-e8a1f25aea67\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzw2f" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680139 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/61805635-7d39-4980-b1be-18cf8f05074d-default-certificate\") pod \"router-default-5444994796-pg7gk\" (UID: \"61805635-7d39-4980-b1be-18cf8f05074d\") " pod="openshift-ingress/router-default-5444994796-pg7gk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680157 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33549d64-6034-4f60-b254-2729e899e541-serving-cert\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680212 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/33549d64-6034-4f60-b254-2729e899e541-audit-dir\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680236 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/61805635-7d39-4980-b1be-18cf8f05074d-stats-auth\") pod \"router-default-5444994796-pg7gk\" (UID: \"61805635-7d39-4980-b1be-18cf8f05074d\") " pod="openshift-ingress/router-default-5444994796-pg7gk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680261 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29h4t\" (UniqueName: 
\"kubernetes.io/projected/7411fb2b-cd62-452d-a5c8-94135752329d-kube-api-access-29h4t\") pod \"machine-api-operator-5694c8668f-lljlj\" (UID: \"7411fb2b-cd62-452d-a5c8-94135752329d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lljlj" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680280 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680302 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61805635-7d39-4980-b1be-18cf8f05074d-metrics-certs\") pod \"router-default-5444994796-pg7gk\" (UID: \"61805635-7d39-4980-b1be-18cf8f05074d\") " pod="openshift-ingress/router-default-5444994796-pg7gk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680334 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e1a4023-11b5-47b3-886a-c37ecabc0103-config\") pod \"machine-approver-56656f9798-bdj4n\" (UID: \"2e1a4023-11b5-47b3-886a-c37ecabc0103\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bdj4n" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680358 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a9ba96f-26e6-4870-9e59-9735f210eef3-client-ca\") pod \"controller-manager-879f6c89f-9m6gl\" (UID: \"5a9ba96f-26e6-4870-9e59-9735f210eef3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 
05:48:58.680375 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c53da5b2-9e42-4160-8ed1-2600d9e76880-client-ca\") pod \"route-controller-manager-6576b87f9c-b7522\" (UID: \"c53da5b2-9e42-4160-8ed1-2600d9e76880\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680392 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tn57\" (UniqueName: \"kubernetes.io/projected/f43ce73f-2618-4c47-9304-7db1e67db22f-kube-api-access-7tn57\") pod \"openshift-config-operator-7777fb866f-fbl5d\" (UID: \"f43ce73f-2618-4c47-9304-7db1e67db22f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fbl5d" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680410 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d72ef0c5-fb30-4d98-9237-a992acf49959-audit-dir\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680424 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8d17264-7c40-4eb6-82eb-f4020e635dde-serving-cert\") pod \"apiserver-7bbb656c7d-stbsw\" (UID: \"b8d17264-7c40-4eb6-82eb-f4020e635dde\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680440 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccv8w\" (UniqueName: \"kubernetes.io/projected/8f7d294e-1e88-4265-8ad2-3344b7148caa-kube-api-access-ccv8w\") pod \"openshift-apiserver-operator-796bbdcf4f-c5v7h\" (UID: \"8f7d294e-1e88-4265-8ad2-3344b7148caa\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c5v7h" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680456 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-trusted-ca-bundle\") pod \"console-f9d7485db-tc859\" (UID: \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " pod="openshift-console/console-f9d7485db-tc859" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680472 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c53da5b2-9e42-4160-8ed1-2600d9e76880-config\") pod \"route-controller-manager-6576b87f9c-b7522\" (UID: \"c53da5b2-9e42-4160-8ed1-2600d9e76880\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680491 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbkch\" (UniqueName: \"kubernetes.io/projected/c53da5b2-9e42-4160-8ed1-2600d9e76880-kube-api-access-wbkch\") pod \"route-controller-manager-6576b87f9c-b7522\" (UID: \"c53da5b2-9e42-4160-8ed1-2600d9e76880\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680508 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdgn7\" (UniqueName: \"kubernetes.io/projected/d72ef0c5-fb30-4d98-9237-a992acf49959-kube-api-access-qdgn7\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680526 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5a9ba96f-26e6-4870-9e59-9735f210eef3-config\") pod \"controller-manager-879f6c89f-9m6gl\" (UID: \"5a9ba96f-26e6-4870-9e59-9735f210eef3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680544 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a9ba96f-26e6-4870-9e59-9735f210eef3-serving-cert\") pod \"controller-manager-879f6c89f-9m6gl\" (UID: \"5a9ba96f-26e6-4870-9e59-9735f210eef3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680562 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njqgw\" (UniqueName: \"kubernetes.io/projected/f96a9f3e-b511-4f78-83fd-a134aa3d5106-kube-api-access-njqgw\") pod \"cluster-samples-operator-665b6dd947-p8b4x\" (UID: \"f96a9f3e-b511-4f78-83fd-a134aa3d5106\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p8b4x" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680580 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71f66291-128a-4347-9c2b-e8a1f25aea67-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wzw2f\" (UID: \"71f66291-128a-4347-9c2b-e8a1f25aea67\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzw2f" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680596 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dgzp\" (UniqueName: \"kubernetes.io/projected/2e1a4023-11b5-47b3-886a-c37ecabc0103-kube-api-access-8dgzp\") pod \"machine-approver-56656f9798-bdj4n\" (UID: \"2e1a4023-11b5-47b3-886a-c37ecabc0103\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bdj4n" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680614 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680631 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dzhd\" (UniqueName: \"kubernetes.io/projected/c2a55f3d-af8c-4589-a9b0-22b324a77524-kube-api-access-4dzhd\") pod \"ingress-operator-5b745b69d9-twbsv\" (UID: \"c2a55f3d-af8c-4589-a9b0-22b324a77524\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-twbsv" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680647 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l85k\" (UniqueName: \"kubernetes.io/projected/61805635-7d39-4980-b1be-18cf8f05074d-kube-api-access-2l85k\") pod \"router-default-5444994796-pg7gk\" (UID: \"61805635-7d39-4980-b1be-18cf8f05074d\") " pod="openshift-ingress/router-default-5444994796-pg7gk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680673 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f7d294e-1e88-4265-8ad2-3344b7148caa-config\") pod \"openshift-apiserver-operator-796bbdcf4f-c5v7h\" (UID: \"8f7d294e-1e88-4265-8ad2-3344b7148caa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c5v7h" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680689 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzk27\" 
(UniqueName: \"kubernetes.io/projected/f7ad651f-69ed-4f4f-9792-6f6a1e635edc-kube-api-access-fzk27\") pod \"authentication-operator-69f744f599-2gwhz\" (UID: \"f7ad651f-69ed-4f4f-9792-6f6a1e635edc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2gwhz" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680707 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b8d17264-7c40-4eb6-82eb-f4020e635dde-encryption-config\") pod \"apiserver-7bbb656c7d-stbsw\" (UID: \"b8d17264-7c40-4eb6-82eb-f4020e635dde\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680723 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/33549d64-6034-4f60-b254-2729e899e541-audit\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680756 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtnpg\" (UniqueName: \"kubernetes.io/projected/d13ea6d8-477e-4add-b9dc-f8cac9eb0b01-kube-api-access-gtnpg\") pod \"downloads-7954f5f757-mx9zm\" (UID: \"d13ea6d8-477e-4add-b9dc-f8cac9eb0b01\") " pod="openshift-console/downloads-7954f5f757-mx9zm" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680772 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b8d17264-7c40-4eb6-82eb-f4020e635dde-audit-policies\") pod \"apiserver-7bbb656c7d-stbsw\" (UID: \"b8d17264-7c40-4eb6-82eb-f4020e635dde\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680790 4869 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-oauth-serving-cert\") pod \"console-f9d7485db-tc859\" (UID: \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " pod="openshift-console/console-f9d7485db-tc859" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680807 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61805635-7d39-4980-b1be-18cf8f05074d-service-ca-bundle\") pod \"router-default-5444994796-pg7gk\" (UID: \"61805635-7d39-4980-b1be-18cf8f05074d\") " pod="openshift-ingress/router-default-5444994796-pg7gk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680833 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f39f9dc9-3127-4302-93dd-faf8bb814b58-trusted-ca\") pod \"console-operator-58897d9998-mqfwk\" (UID: \"f39f9dc9-3127-4302-93dd-faf8bb814b58\") " pod="openshift-console-operator/console-operator-58897d9998-mqfwk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680851 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7411fb2b-cd62-452d-a5c8-94135752329d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lljlj\" (UID: \"7411fb2b-cd62-452d-a5c8-94135752329d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lljlj" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680882 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d72ef0c5-fb30-4d98-9237-a992acf49959-audit-policies\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc 
kubenswrapper[4869]: I0218 05:48:58.680898 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2e1a4023-11b5-47b3-886a-c37ecabc0103-machine-approver-tls\") pod \"machine-approver-56656f9798-bdj4n\" (UID: \"2e1a4023-11b5-47b3-886a-c37ecabc0103\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bdj4n" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680914 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f39f9dc9-3127-4302-93dd-faf8bb814b58-serving-cert\") pod \"console-operator-58897d9998-mqfwk\" (UID: \"f39f9dc9-3127-4302-93dd-faf8bb814b58\") " pod="openshift-console-operator/console-operator-58897d9998-mqfwk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680932 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7d7w\" (UniqueName: \"kubernetes.io/projected/41f2c352-c537-4c3b-b206-ef73eba45593-kube-api-access-d7d7w\") pod \"cluster-image-registry-operator-dc59b4c8b-wgv88\" (UID: \"41f2c352-c537-4c3b-b206-ef73eba45593\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wgv88" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680948 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5a9ba96f-26e6-4870-9e59-9735f210eef3-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9m6gl\" (UID: \"5a9ba96f-26e6-4870-9e59-9735f210eef3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.680966 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5gb6\" (UniqueName: 
\"kubernetes.io/projected/71f66291-128a-4347-9c2b-e8a1f25aea67-kube-api-access-c5gb6\") pod \"openshift-controller-manager-operator-756b6f6bc6-wzw2f\" (UID: \"71f66291-128a-4347-9c2b-e8a1f25aea67\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzw2f" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.683166 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e1a4023-11b5-47b3-886a-c37ecabc0103-config\") pod \"machine-approver-56656f9798-bdj4n\" (UID: \"2e1a4023-11b5-47b3-886a-c37ecabc0103\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bdj4n" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.683457 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lpmxp"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.683955 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a9ba96f-26e6-4870-9e59-9735f210eef3-client-ca\") pod \"controller-manager-879f6c89f-9m6gl\" (UID: \"5a9ba96f-26e6-4870-9e59-9735f210eef3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.683965 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pmqtp"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.683816 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.684183 4869 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/33549d64-6034-4f60-b254-2729e899e541-audit-dir\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.684684 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pmqtp" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.684989 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zd46x" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.685041 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lpmxp" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.685216 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.685721 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-trusted-ca-bundle\") pod \"console-f9d7485db-tc859\" (UID: \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " pod="openshift-console/console-f9d7485db-tc859" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.689706 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f43ce73f-2618-4c47-9304-7db1e67db22f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fbl5d\" (UID: \"f43ce73f-2618-4c47-9304-7db1e67db22f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fbl5d" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.693708 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f7d294e-1e88-4265-8ad2-3344b7148caa-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-c5v7h\" (UID: \"8f7d294e-1e88-4265-8ad2-3344b7148caa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c5v7h" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.694379 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7ad651f-69ed-4f4f-9792-6f6a1e635edc-config\") pod \"authentication-operator-69f744f599-2gwhz\" (UID: \"f7ad651f-69ed-4f4f-9792-6f6a1e635edc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2gwhz" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.694990 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f7ad651f-69ed-4f4f-9792-6f6a1e635edc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-2gwhz\" (UID: \"f7ad651f-69ed-4f4f-9792-6f6a1e635edc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2gwhz" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.695499 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-service-ca\") pod \"console-f9d7485db-tc859\" (UID: \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " pod="openshift-console/console-f9d7485db-tc859" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.698024 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2e1a4023-11b5-47b3-886a-c37ecabc0103-machine-approver-tls\") pod \"machine-approver-56656f9798-bdj4n\" (UID: \"2e1a4023-11b5-47b3-886a-c37ecabc0103\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bdj4n" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.698811 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.699325 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b8d17264-7c40-4eb6-82eb-f4020e635dde-audit-policies\") pod \"apiserver-7bbb656c7d-stbsw\" (UID: \"b8d17264-7c40-4eb6-82eb-f4020e635dde\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.699400 4869 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-oauth-serving-cert\") pod \"console-f9d7485db-tc859\" (UID: \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " pod="openshift-console/console-f9d7485db-tc859" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.699839 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/33549d64-6034-4f60-b254-2729e899e541-etcd-serving-ca\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.699859 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7ad651f-69ed-4f4f-9792-6f6a1e635edc-service-ca-bundle\") pod \"authentication-operator-69f744f599-2gwhz\" (UID: \"f7ad651f-69ed-4f4f-9792-6f6a1e635edc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2gwhz" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.700291 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33549d64-6034-4f60-b254-2729e899e541-config\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.700491 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33549d64-6034-4f60-b254-2729e899e541-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.700954 4869 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d72ef0c5-fb30-4d98-9237-a992acf49959-audit-policies\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.703976 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c53da5b2-9e42-4160-8ed1-2600d9e76880-client-ca\") pod \"route-controller-manager-6576b87f9c-b7522\" (UID: \"c53da5b2-9e42-4160-8ed1-2600d9e76880\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.704022 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/33549d64-6034-4f60-b254-2729e899e541-node-pullsecrets\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.704973 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b8d17264-7c40-4eb6-82eb-f4020e635dde-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-stbsw\" (UID: \"b8d17264-7c40-4eb6-82eb-f4020e635dde\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.705359 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.705887 4869 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f7d294e-1e88-4265-8ad2-3344b7148caa-config\") pod \"openshift-apiserver-operator-796bbdcf4f-c5v7h\" (UID: \"8f7d294e-1e88-4265-8ad2-3344b7148caa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c5v7h" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.706472 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/33549d64-6034-4f60-b254-2729e899e541-image-import-ca\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.707871 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/41f2c352-c537-4c3b-b206-ef73eba45593-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wgv88\" (UID: \"41f2c352-c537-4c3b-b206-ef73eba45593\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wgv88" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.709802 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d72ef0c5-fb30-4d98-9237-a992acf49959-audit-dir\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.710542 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c53da5b2-9e42-4160-8ed1-2600d9e76880-config\") pod \"route-controller-manager-6576b87f9c-b7522\" (UID: \"c53da5b2-9e42-4160-8ed1-2600d9e76880\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522" Feb 18 05:48:58 crc 
kubenswrapper[4869]: I0218 05:48:58.710984 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/33549d64-6034-4f60-b254-2729e899e541-audit\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.732347 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8d17264-7c40-4eb6-82eb-f4020e635dde-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-stbsw\" (UID: \"b8d17264-7c40-4eb6-82eb-f4020e635dde\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.711835 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b8d17264-7c40-4eb6-82eb-f4020e635dde-audit-dir\") pod \"apiserver-7bbb656c7d-stbsw\" (UID: \"b8d17264-7c40-4eb6-82eb-f4020e635dde\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.712520 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.713048 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-console-config\") pod \"console-f9d7485db-tc859\" (UID: \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " pod="openshift-console/console-f9d7485db-tc859" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.713217 
4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7411fb2b-cd62-452d-a5c8-94135752329d-images\") pod \"machine-api-operator-5694c8668f-lljlj\" (UID: \"7411fb2b-cd62-452d-a5c8-94135752329d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lljlj" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.713456 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.713893 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-console-oauth-config\") pod \"console-f9d7485db-tc859\" (UID: \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " pod="openshift-console/console-f9d7485db-tc859" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.714144 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.714445 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.730904 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2e1a4023-11b5-47b3-886a-c37ecabc0103-auth-proxy-config\") pod \"machine-approver-56656f9798-bdj4n\" (UID: \"2e1a4023-11b5-47b3-886a-c37ecabc0103\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bdj4n" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.731429 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a9ba96f-26e6-4870-9e59-9735f210eef3-config\") pod \"controller-manager-879f6c89f-9m6gl\" (UID: \"5a9ba96f-26e6-4870-9e59-9735f210eef3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.712427 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7411fb2b-cd62-452d-a5c8-94135752329d-config\") pod \"machine-api-operator-5694c8668f-lljlj\" (UID: \"7411fb2b-cd62-452d-a5c8-94135752329d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lljlj" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.734532 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.734633 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-user-template-error\") pod 
\"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.735921 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.737529 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b8d17264-7c40-4eb6-82eb-f4020e635dde-etcd-client\") pod \"apiserver-7bbb656c7d-stbsw\" (UID: \"b8d17264-7c40-4eb6-82eb-f4020e635dde\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.737574 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5a9ba96f-26e6-4870-9e59-9735f210eef3-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9m6gl\" (UID: \"5a9ba96f-26e6-4870-9e59-9735f210eef3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.740305 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/33549d64-6034-4f60-b254-2729e899e541-encryption-config\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.740355 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hj6gq"] Feb 18 
05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.741349 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/41f2c352-c537-4c3b-b206-ef73eba45593-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wgv88\" (UID: \"41f2c352-c537-4c3b-b206-ef73eba45593\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wgv88" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.741545 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33549d64-6034-4f60-b254-2729e899e541-serving-cert\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.741703 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b8d17264-7c40-4eb6-82eb-f4020e635dde-encryption-config\") pod \"apiserver-7bbb656c7d-stbsw\" (UID: \"b8d17264-7c40-4eb6-82eb-f4020e635dde\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.742791 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/33549d64-6034-4f60-b254-2729e899e541-etcd-client\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.743415 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.743523 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/7411fb2b-cd62-452d-a5c8-94135752329d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lljlj\" (UID: \"7411fb2b-cd62-452d-a5c8-94135752329d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lljlj" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.744330 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.744333 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f43ce73f-2618-4c47-9304-7db1e67db22f-serving-cert\") pod \"openshift-config-operator-7777fb866f-fbl5d\" (UID: \"f43ce73f-2618-4c47-9304-7db1e67db22f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fbl5d" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.744455 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8d17264-7c40-4eb6-82eb-f4020e635dde-serving-cert\") pod \"apiserver-7bbb656c7d-stbsw\" (UID: \"b8d17264-7c40-4eb6-82eb-f4020e635dde\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.744952 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.748434 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a9ba96f-26e6-4870-9e59-9735f210eef3-serving-cert\") pod \"controller-manager-879f6c89f-9m6gl\" (UID: 
\"5a9ba96f-26e6-4870-9e59-9735f210eef3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.750633 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7ad651f-69ed-4f4f-9792-6f6a1e635edc-serving-cert\") pod \"authentication-operator-69f744f599-2gwhz\" (UID: \"f7ad651f-69ed-4f4f-9792-6f6a1e635edc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2gwhz" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.753643 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c53da5b2-9e42-4160-8ed1-2600d9e76880-serving-cert\") pod \"route-controller-manager-6576b87f9c-b7522\" (UID: \"c53da5b2-9e42-4160-8ed1-2600d9e76880\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.753802 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-console-serving-cert\") pod \"console-f9d7485db-tc859\" (UID: \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " pod="openshift-console/console-f9d7485db-tc859" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.754582 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523225-j2h97"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.754906 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.754960 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.755258 4869 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hj6gq" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.755346 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-j2h97" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.755242 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jc49r"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.756530 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-jc49r" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.757532 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-tzbsp"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.758482 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-tzbsp" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.759172 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5d72"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.760294 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5d72" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.761032 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-8xm9f"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.761854 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8xm9f" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.761936 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lljlj"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.762839 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-t9nfk"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.763396 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t9nfk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.763796 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.764729 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bft8b"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.765678 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-2gwhz"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.766595 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c5v7h"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.767473 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wgv88"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.767782 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.768318 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fbl5d"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 
05:48:58.769370 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-h6w89"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.770392 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-tc859"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.770639 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-h6w89" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.770975 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mqfwk"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.771806 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-twbsv"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.772655 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zmmhg"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.773512 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2hvk6"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.774362 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-2mz86"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.775217 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.776028 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mx9zm"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.776986 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p8b4x"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.777734 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9m6gl"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.779514 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8qkfs"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.780384 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-cnzqw"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.781244 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lpmxp"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782001 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16d0750f-52c1-450d-8f18-d068333e7bc3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vpxmk\" (UID: \"16d0750f-52c1-450d-8f18-d068333e7bc3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vpxmk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782064 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/81e0ea07-506f-4560-90de-b5ae7675113f-etcd-client\") pod \"etcd-operator-b45778765-fhpv8\" (UID: \"81e0ea07-506f-4560-90de-b5ae7675113f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fhpv8" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782092 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/84302a1e-f497-4332-8736-5e53c6741081-signing-cabundle\") pod \"service-ca-9c57cc56f-tzbsp\" (UID: \"84302a1e-f497-4332-8736-5e53c6741081\") " pod="openshift-service-ca/service-ca-9c57cc56f-tzbsp" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782125 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf-secret-volume\") pod \"collect-profiles-29523225-j2h97\" (UID: \"c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-j2h97" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782167 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wt67\" (UniqueName: \"kubernetes.io/projected/67eed62e-4836-47bb-96d0-cd6668d64a78-kube-api-access-5wt67\") pod \"multus-admission-controller-857f4d67dd-cnzqw\" (UID: \"67eed62e-4836-47bb-96d0-cd6668d64a78\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cnzqw" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782201 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzw2f"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782311 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/81e0ea07-506f-4560-90de-b5ae7675113f-etcd-service-ca\") pod \"etcd-operator-b45778765-fhpv8\" (UID: \"81e0ea07-506f-4560-90de-b5ae7675113f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fhpv8" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782351 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnpfw\" (UniqueName: 
\"kubernetes.io/projected/c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf-kube-api-access-mnpfw\") pod \"collect-profiles-29523225-j2h97\" (UID: \"c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-j2h97" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782382 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/81e0ea07-506f-4560-90de-b5ae7675113f-etcd-ca\") pod \"etcd-operator-b45778765-fhpv8\" (UID: \"81e0ea07-506f-4560-90de-b5ae7675113f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fhpv8" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782421 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f39f9dc9-3127-4302-93dd-faf8bb814b58-config\") pod \"console-operator-58897d9998-mqfwk\" (UID: \"f39f9dc9-3127-4302-93dd-faf8bb814b58\") " pod="openshift-console-operator/console-operator-58897d9998-mqfwk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782440 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/79b505d9-7c7c-4ef4-a689-e2b80d855a09-srv-cert\") pod \"catalog-operator-68c6474976-b5d72\" (UID: \"79b505d9-7c7c-4ef4-a689-e2b80d855a09\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5d72" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782478 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2a55f3d-af8c-4589-a9b0-22b324a77524-trusted-ca\") pod \"ingress-operator-5b745b69d9-twbsv\" (UID: \"c2a55f3d-af8c-4589-a9b0-22b324a77524\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-twbsv" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782499 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81e0ea07-506f-4560-90de-b5ae7675113f-serving-cert\") pod \"etcd-operator-b45778765-fhpv8\" (UID: \"81e0ea07-506f-4560-90de-b5ae7675113f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fhpv8" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782534 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tz77\" (UniqueName: \"kubernetes.io/projected/81e0ea07-506f-4560-90de-b5ae7675113f-kube-api-access-8tz77\") pod \"etcd-operator-b45778765-fhpv8\" (UID: \"81e0ea07-506f-4560-90de-b5ae7675113f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fhpv8" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782560 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81e0ea07-506f-4560-90de-b5ae7675113f-config\") pod \"etcd-operator-b45778765-fhpv8\" (UID: \"81e0ea07-506f-4560-90de-b5ae7675113f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fhpv8" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782582 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71f66291-128a-4347-9c2b-e8a1f25aea67-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wzw2f\" (UID: \"71f66291-128a-4347-9c2b-e8a1f25aea67\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzw2f" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782606 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/be5b136e-e664-4ed5-9fbd-e2a9bdd06db9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-pmqtp\" (UID: \"be5b136e-e664-4ed5-9fbd-e2a9bdd06db9\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pmqtp" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782656 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dsp2\" (UniqueName: \"kubernetes.io/projected/79b505d9-7c7c-4ef4-a689-e2b80d855a09-kube-api-access-7dsp2\") pod \"catalog-operator-68c6474976-b5d72\" (UID: \"79b505d9-7c7c-4ef4-a689-e2b80d855a09\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5d72" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782684 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16d0750f-52c1-450d-8f18-d068333e7bc3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vpxmk\" (UID: \"16d0750f-52c1-450d-8f18-d068333e7bc3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vpxmk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782722 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/67eed62e-4836-47bb-96d0-cd6668d64a78-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-cnzqw\" (UID: \"67eed62e-4836-47bb-96d0-cd6668d64a78\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cnzqw" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782768 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffbxr\" (UniqueName: \"kubernetes.io/projected/d3b34965-6f66-4f22-b655-05ae645a8f19-kube-api-access-ffbxr\") pod \"kube-storage-version-migrator-operator-b67b599dd-2hvk6\" (UID: \"d3b34965-6f66-4f22-b655-05ae645a8f19\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2hvk6" Feb 18 05:48:58 crc 
kubenswrapper[4869]: I0218 05:48:58.782796 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/79b505d9-7c7c-4ef4-a689-e2b80d855a09-profile-collector-cert\") pod \"catalog-operator-68c6474976-b5d72\" (UID: \"79b505d9-7c7c-4ef4-a689-e2b80d855a09\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5d72" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782817 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tjpw\" (UniqueName: \"kubernetes.io/projected/a5c8a86f-e3a2-4088-9839-386b9dc56d03-kube-api-access-4tjpw\") pod \"marketplace-operator-79b997595-zd46x\" (UID: \"a5c8a86f-e3a2-4088-9839-386b9dc56d03\") " pod="openshift-marketplace/marketplace-operator-79b997595-zd46x" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782849 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61805635-7d39-4980-b1be-18cf8f05074d-service-ca-bundle\") pod \"router-default-5444994796-pg7gk\" (UID: \"61805635-7d39-4980-b1be-18cf8f05074d\") " pod="openshift-ingress/router-default-5444994796-pg7gk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782874 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f39f9dc9-3127-4302-93dd-faf8bb814b58-trusted-ca\") pod \"console-operator-58897d9998-mqfwk\" (UID: \"f39f9dc9-3127-4302-93dd-faf8bb814b58\") " pod="openshift-console-operator/console-operator-58897d9998-mqfwk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782899 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5gb6\" (UniqueName: \"kubernetes.io/projected/71f66291-128a-4347-9c2b-e8a1f25aea67-kube-api-access-c5gb6\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-wzw2f\" (UID: \"71f66291-128a-4347-9c2b-e8a1f25aea67\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzw2f" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782921 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c2a55f3d-af8c-4589-a9b0-22b324a77524-bound-sa-token\") pod \"ingress-operator-5b745b69d9-twbsv\" (UID: \"c2a55f3d-af8c-4589-a9b0-22b324a77524\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-twbsv" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782964 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16d0750f-52c1-450d-8f18-d068333e7bc3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vpxmk\" (UID: \"16d0750f-52c1-450d-8f18-d068333e7bc3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vpxmk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.782986 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9ssv\" (UniqueName: \"kubernetes.io/projected/84302a1e-f497-4332-8736-5e53c6741081-kube-api-access-w9ssv\") pod \"service-ca-9c57cc56f-tzbsp\" (UID: \"84302a1e-f497-4332-8736-5e53c6741081\") " pod="openshift-service-ca/service-ca-9c57cc56f-tzbsp" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.783008 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf-config-volume\") pod \"collect-profiles-29523225-j2h97\" (UID: \"c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-j2h97" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 
05:48:58.783034 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3b34965-6f66-4f22-b655-05ae645a8f19-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2hvk6\" (UID: \"d3b34965-6f66-4f22-b655-05ae645a8f19\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2hvk6" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.783072 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tvln\" (UniqueName: \"kubernetes.io/projected/f39f9dc9-3127-4302-93dd-faf8bb814b58-kube-api-access-8tvln\") pod \"console-operator-58897d9998-mqfwk\" (UID: \"f39f9dc9-3127-4302-93dd-faf8bb814b58\") " pod="openshift-console-operator/console-operator-58897d9998-mqfwk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.783109 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b635dd52-43a3-4285-a6f9-86b872db734a-config\") pod \"service-ca-operator-777779d784-zmmhg\" (UID: \"b635dd52-43a3-4285-a6f9-86b872db734a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zmmhg" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.783133 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq47d\" (UniqueName: \"kubernetes.io/projected/b635dd52-43a3-4285-a6f9-86b872db734a-kube-api-access-lq47d\") pod \"service-ca-operator-777779d784-zmmhg\" (UID: \"b635dd52-43a3-4285-a6f9-86b872db734a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zmmhg" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.783171 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/f96a9f3e-b511-4f78-83fd-a134aa3d5106-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-p8b4x\" (UID: \"f96a9f3e-b511-4f78-83fd-a134aa3d5106\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p8b4x" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.783196 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5c8a86f-e3a2-4088-9839-386b9dc56d03-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zd46x\" (UID: \"a5c8a86f-e3a2-4088-9839-386b9dc56d03\") " pod="openshift-marketplace/marketplace-operator-79b997595-zd46x" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.783217 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/84302a1e-f497-4332-8736-5e53c6741081-signing-key\") pod \"service-ca-9c57cc56f-tzbsp\" (UID: \"84302a1e-f497-4332-8736-5e53c6741081\") " pod="openshift-service-ca/service-ca-9c57cc56f-tzbsp" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.783238 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b635dd52-43a3-4285-a6f9-86b872db734a-serving-cert\") pod \"service-ca-operator-777779d784-zmmhg\" (UID: \"b635dd52-43a3-4285-a6f9-86b872db734a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zmmhg" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.783257 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3b34965-6f66-4f22-b655-05ae645a8f19-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2hvk6\" (UID: \"d3b34965-6f66-4f22-b655-05ae645a8f19\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2hvk6" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.783291 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2a55f3d-af8c-4589-a9b0-22b324a77524-metrics-tls\") pod \"ingress-operator-5b745b69d9-twbsv\" (UID: \"c2a55f3d-af8c-4589-a9b0-22b324a77524\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-twbsv" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.783312 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/61805635-7d39-4980-b1be-18cf8f05074d-default-certificate\") pod \"router-default-5444994796-pg7gk\" (UID: \"61805635-7d39-4980-b1be-18cf8f05074d\") " pod="openshift-ingress/router-default-5444994796-pg7gk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.783333 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/61805635-7d39-4980-b1be-18cf8f05074d-stats-auth\") pod \"router-default-5444994796-pg7gk\" (UID: \"61805635-7d39-4980-b1be-18cf8f05074d\") " pod="openshift-ingress/router-default-5444994796-pg7gk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.783351 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61805635-7d39-4980-b1be-18cf8f05074d-metrics-certs\") pod \"router-default-5444994796-pg7gk\" (UID: \"61805635-7d39-4980-b1be-18cf8f05074d\") " pod="openshift-ingress/router-default-5444994796-pg7gk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.783370 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x5jz\" (UniqueName: 
\"kubernetes.io/projected/be5b136e-e664-4ed5-9fbd-e2a9bdd06db9-kube-api-access-5x5jz\") pod \"control-plane-machine-set-operator-78cbb6b69f-pmqtp\" (UID: \"be5b136e-e664-4ed5-9fbd-e2a9bdd06db9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pmqtp" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.783409 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njqgw\" (UniqueName: \"kubernetes.io/projected/f96a9f3e-b511-4f78-83fd-a134aa3d5106-kube-api-access-njqgw\") pod \"cluster-samples-operator-665b6dd947-p8b4x\" (UID: \"f96a9f3e-b511-4f78-83fd-a134aa3d5106\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p8b4x" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.783427 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71f66291-128a-4347-9c2b-e8a1f25aea67-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wzw2f\" (UID: \"71f66291-128a-4347-9c2b-e8a1f25aea67\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzw2f" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.783457 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dzhd\" (UniqueName: \"kubernetes.io/projected/c2a55f3d-af8c-4589-a9b0-22b324a77524-kube-api-access-4dzhd\") pod \"ingress-operator-5b745b69d9-twbsv\" (UID: \"c2a55f3d-af8c-4589-a9b0-22b324a77524\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-twbsv" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.783477 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l85k\" (UniqueName: \"kubernetes.io/projected/61805635-7d39-4980-b1be-18cf8f05074d-kube-api-access-2l85k\") pod \"router-default-5444994796-pg7gk\" (UID: \"61805635-7d39-4980-b1be-18cf8f05074d\") " 
pod="openshift-ingress/router-default-5444994796-pg7gk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.783502 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/311f547d-0650-427d-8a3d-31bfa0b56403-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zkqc9\" (UID: \"311f547d-0650-427d-8a3d-31bfa0b56403\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zkqc9" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.783520 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a5c8a86f-e3a2-4088-9839-386b9dc56d03-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zd46x\" (UID: \"a5c8a86f-e3a2-4088-9839-386b9dc56d03\") " pod="openshift-marketplace/marketplace-operator-79b997595-zd46x" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.783549 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/311f547d-0650-427d-8a3d-31bfa0b56403-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zkqc9\" (UID: \"311f547d-0650-427d-8a3d-31bfa0b56403\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zkqc9" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.783568 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f39f9dc9-3127-4302-93dd-faf8bb814b58-serving-cert\") pod \"console-operator-58897d9998-mqfwk\" (UID: \"f39f9dc9-3127-4302-93dd-faf8bb814b58\") " pod="openshift-console-operator/console-operator-58897d9998-mqfwk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.783586 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/311f547d-0650-427d-8a3d-31bfa0b56403-config\") pod \"kube-apiserver-operator-766d6c64bb-zkqc9\" (UID: \"311f547d-0650-427d-8a3d-31bfa0b56403\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zkqc9" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.783775 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85kpf"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.784278 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71f66291-128a-4347-9c2b-e8a1f25aea67-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wzw2f\" (UID: \"71f66291-128a-4347-9c2b-e8a1f25aea67\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzw2f" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.784480 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2a55f3d-af8c-4589-a9b0-22b324a77524-trusted-ca\") pod \"ingress-operator-5b745b69d9-twbsv\" (UID: \"c2a55f3d-af8c-4589-a9b0-22b324a77524\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-twbsv" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.784988 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tthlh"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.785425 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f39f9dc9-3127-4302-93dd-faf8bb814b58-trusted-ca\") pod \"console-operator-58897d9998-mqfwk\" (UID: \"f39f9dc9-3127-4302-93dd-faf8bb814b58\") " pod="openshift-console-operator/console-operator-58897d9998-mqfwk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.785541 4869 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f39f9dc9-3127-4302-93dd-faf8bb814b58-config\") pod \"console-operator-58897d9998-mqfwk\" (UID: \"f39f9dc9-3127-4302-93dd-faf8bb814b58\") " pod="openshift-console-operator/console-operator-58897d9998-mqfwk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.785891 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-zc447"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.786693 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-zc447" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.786933 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2hlr"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.787623 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7qwwp"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.788012 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.788576 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-59xq2"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.789503 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/61805635-7d39-4980-b1be-18cf8f05074d-stats-auth\") pod \"router-default-5444994796-pg7gk\" (UID: \"61805635-7d39-4980-b1be-18cf8f05074d\") " pod="openshift-ingress/router-default-5444994796-pg7gk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.789872 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zkqc9"] 
Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.789974 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-59xq2" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.790146 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2a55f3d-af8c-4589-a9b0-22b324a77524-metrics-tls\") pod \"ingress-operator-5b745b69d9-twbsv\" (UID: \"c2a55f3d-af8c-4589-a9b0-22b324a77524\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-twbsv" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.790158 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71f66291-128a-4347-9c2b-e8a1f25aea67-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wzw2f\" (UID: \"71f66291-128a-4347-9c2b-e8a1f25aea67\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzw2f" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.790781 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/61805635-7d39-4980-b1be-18cf8f05074d-default-certificate\") pod \"router-default-5444994796-pg7gk\" (UID: \"61805635-7d39-4980-b1be-18cf8f05074d\") " pod="openshift-ingress/router-default-5444994796-pg7gk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.790895 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f96a9f3e-b511-4f78-83fd-a134aa3d5106-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-p8b4x\" (UID: \"f96a9f3e-b511-4f78-83fd-a134aa3d5106\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p8b4x" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.791634 4869 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f39f9dc9-3127-4302-93dd-faf8bb814b58-serving-cert\") pod \"console-operator-58897d9998-mqfwk\" (UID: \"f39f9dc9-3127-4302-93dd-faf8bb814b58\") " pod="openshift-console-operator/console-operator-58897d9998-mqfwk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.791652 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pmqtp"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.792914 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vpxmk"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.796622 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zd46x"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.797759 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61805635-7d39-4980-b1be-18cf8f05074d-metrics-certs\") pod \"router-default-5444994796-pg7gk\" (UID: \"61805635-7d39-4980-b1be-18cf8f05074d\") " pod="openshift-ingress/router-default-5444994796-pg7gk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.799436 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jc49r"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.800947 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-t9nfk"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.801684 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hj6gq"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.805838 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-etcd-operator/etcd-operator-b45778765-fhpv8"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.807258 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5d72"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.808203 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.808509 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6rqpw"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.810045 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523225-j2h97"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.810904 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-tzbsp"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.811849 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-h6w89"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.812826 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-59xq2"] Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.827983 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.834970 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61805635-7d39-4980-b1be-18cf8f05074d-service-ca-bundle\") pod \"router-default-5444994796-pg7gk\" (UID: \"61805635-7d39-4980-b1be-18cf8f05074d\") " pod="openshift-ingress/router-default-5444994796-pg7gk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.849107 4869 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.868137 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.874964 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/81e0ea07-506f-4560-90de-b5ae7675113f-etcd-service-ca\") pod \"etcd-operator-b45778765-fhpv8\" (UID: \"81e0ea07-506f-4560-90de-b5ae7675113f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fhpv8" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.884415 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wt67\" (UniqueName: \"kubernetes.io/projected/67eed62e-4836-47bb-96d0-cd6668d64a78-kube-api-access-5wt67\") pod \"multus-admission-controller-857f4d67dd-cnzqw\" (UID: \"67eed62e-4836-47bb-96d0-cd6668d64a78\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cnzqw" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.884471 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnpfw\" (UniqueName: \"kubernetes.io/projected/c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf-kube-api-access-mnpfw\") pod \"collect-profiles-29523225-j2h97\" (UID: \"c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-j2h97" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.884556 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/79b505d9-7c7c-4ef4-a689-e2b80d855a09-srv-cert\") pod \"catalog-operator-68c6474976-b5d72\" (UID: \"79b505d9-7c7c-4ef4-a689-e2b80d855a09\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5d72" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.884636 
4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/be5b136e-e664-4ed5-9fbd-e2a9bdd06db9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-pmqtp\" (UID: \"be5b136e-e664-4ed5-9fbd-e2a9bdd06db9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pmqtp" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.884683 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dsp2\" (UniqueName: \"kubernetes.io/projected/79b505d9-7c7c-4ef4-a689-e2b80d855a09-kube-api-access-7dsp2\") pod \"catalog-operator-68c6474976-b5d72\" (UID: \"79b505d9-7c7c-4ef4-a689-e2b80d855a09\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5d72" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.884724 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16d0750f-52c1-450d-8f18-d068333e7bc3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vpxmk\" (UID: \"16d0750f-52c1-450d-8f18-d068333e7bc3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vpxmk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.884785 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/67eed62e-4836-47bb-96d0-cd6668d64a78-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-cnzqw\" (UID: \"67eed62e-4836-47bb-96d0-cd6668d64a78\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cnzqw" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.884812 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/79b505d9-7c7c-4ef4-a689-e2b80d855a09-profile-collector-cert\") pod \"catalog-operator-68c6474976-b5d72\" (UID: \"79b505d9-7c7c-4ef4-a689-e2b80d855a09\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5d72" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.884839 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffbxr\" (UniqueName: \"kubernetes.io/projected/d3b34965-6f66-4f22-b655-05ae645a8f19-kube-api-access-ffbxr\") pod \"kube-storage-version-migrator-operator-b67b599dd-2hvk6\" (UID: \"d3b34965-6f66-4f22-b655-05ae645a8f19\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2hvk6" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.884868 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tjpw\" (UniqueName: \"kubernetes.io/projected/a5c8a86f-e3a2-4088-9839-386b9dc56d03-kube-api-access-4tjpw\") pod \"marketplace-operator-79b997595-zd46x\" (UID: \"a5c8a86f-e3a2-4088-9839-386b9dc56d03\") " pod="openshift-marketplace/marketplace-operator-79b997595-zd46x" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.884945 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16d0750f-52c1-450d-8f18-d068333e7bc3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vpxmk\" (UID: \"16d0750f-52c1-450d-8f18-d068333e7bc3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vpxmk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.884978 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf-config-volume\") pod \"collect-profiles-29523225-j2h97\" (UID: \"c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-j2h97" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.885005 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9ssv\" (UniqueName: \"kubernetes.io/projected/84302a1e-f497-4332-8736-5e53c6741081-kube-api-access-w9ssv\") pod \"service-ca-9c57cc56f-tzbsp\" (UID: \"84302a1e-f497-4332-8736-5e53c6741081\") " pod="openshift-service-ca/service-ca-9c57cc56f-tzbsp" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.885034 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3b34965-6f66-4f22-b655-05ae645a8f19-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2hvk6\" (UID: \"d3b34965-6f66-4f22-b655-05ae645a8f19\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2hvk6" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.885081 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b635dd52-43a3-4285-a6f9-86b872db734a-config\") pod \"service-ca-operator-777779d784-zmmhg\" (UID: \"b635dd52-43a3-4285-a6f9-86b872db734a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zmmhg" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.885135 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq47d\" (UniqueName: \"kubernetes.io/projected/b635dd52-43a3-4285-a6f9-86b872db734a-kube-api-access-lq47d\") pod \"service-ca-operator-777779d784-zmmhg\" (UID: \"b635dd52-43a3-4285-a6f9-86b872db734a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zmmhg" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.885163 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a5c8a86f-e3a2-4088-9839-386b9dc56d03-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zd46x\" (UID: \"a5c8a86f-e3a2-4088-9839-386b9dc56d03\") " pod="openshift-marketplace/marketplace-operator-79b997595-zd46x" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.885192 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3b34965-6f66-4f22-b655-05ae645a8f19-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2hvk6\" (UID: \"d3b34965-6f66-4f22-b655-05ae645a8f19\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2hvk6" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.885218 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/84302a1e-f497-4332-8736-5e53c6741081-signing-key\") pod \"service-ca-9c57cc56f-tzbsp\" (UID: \"84302a1e-f497-4332-8736-5e53c6741081\") " pod="openshift-service-ca/service-ca-9c57cc56f-tzbsp" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.885249 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b635dd52-43a3-4285-a6f9-86b872db734a-serving-cert\") pod \"service-ca-operator-777779d784-zmmhg\" (UID: \"b635dd52-43a3-4285-a6f9-86b872db734a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zmmhg" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.885292 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x5jz\" (UniqueName: \"kubernetes.io/projected/be5b136e-e664-4ed5-9fbd-e2a9bdd06db9-kube-api-access-5x5jz\") pod \"control-plane-machine-set-operator-78cbb6b69f-pmqtp\" (UID: \"be5b136e-e664-4ed5-9fbd-e2a9bdd06db9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pmqtp" Feb 18 
05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.885382 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a5c8a86f-e3a2-4088-9839-386b9dc56d03-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zd46x\" (UID: \"a5c8a86f-e3a2-4088-9839-386b9dc56d03\") " pod="openshift-marketplace/marketplace-operator-79b997595-zd46x" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.885416 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/311f547d-0650-427d-8a3d-31bfa0b56403-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zkqc9\" (UID: \"311f547d-0650-427d-8a3d-31bfa0b56403\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zkqc9" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.885457 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/311f547d-0650-427d-8a3d-31bfa0b56403-config\") pod \"kube-apiserver-operator-766d6c64bb-zkqc9\" (UID: \"311f547d-0650-427d-8a3d-31bfa0b56403\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zkqc9" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.885481 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/311f547d-0650-427d-8a3d-31bfa0b56403-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zkqc9\" (UID: \"311f547d-0650-427d-8a3d-31bfa0b56403\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zkqc9" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.885517 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16d0750f-52c1-450d-8f18-d068333e7bc3-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-vpxmk\" (UID: \"16d0750f-52c1-450d-8f18-d068333e7bc3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vpxmk" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.885562 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/84302a1e-f497-4332-8736-5e53c6741081-signing-cabundle\") pod \"service-ca-9c57cc56f-tzbsp\" (UID: \"84302a1e-f497-4332-8736-5e53c6741081\") " pod="openshift-service-ca/service-ca-9c57cc56f-tzbsp" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.885590 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf-secret-volume\") pod \"collect-profiles-29523225-j2h97\" (UID: \"c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-j2h97" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.888811 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.908187 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.916293 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81e0ea07-506f-4560-90de-b5ae7675113f-serving-cert\") pod \"etcd-operator-b45778765-fhpv8\" (UID: \"81e0ea07-506f-4560-90de-b5ae7675113f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fhpv8" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.928366 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 
05:48:58.935055 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/81e0ea07-506f-4560-90de-b5ae7675113f-etcd-client\") pod \"etcd-operator-b45778765-fhpv8\" (UID: \"81e0ea07-506f-4560-90de-b5ae7675113f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fhpv8" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.948411 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.968287 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.988044 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 18 05:48:58 crc kubenswrapper[4869]: I0218 05:48:58.994563 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81e0ea07-506f-4560-90de-b5ae7675113f-config\") pod \"etcd-operator-b45778765-fhpv8\" (UID: \"81e0ea07-506f-4560-90de-b5ae7675113f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fhpv8" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.008659 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.013649 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/81e0ea07-506f-4560-90de-b5ae7675113f-etcd-ca\") pod \"etcd-operator-b45778765-fhpv8\" (UID: \"81e0ea07-506f-4560-90de-b5ae7675113f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fhpv8" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.048310 4869 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.068727 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.088762 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.107822 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.128257 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.147851 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.159591 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16d0750f-52c1-450d-8f18-d068333e7bc3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vpxmk\" (UID: \"16d0750f-52c1-450d-8f18-d068333e7bc3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vpxmk" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.169101 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.176534 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16d0750f-52c1-450d-8f18-d068333e7bc3-config\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-vpxmk\" (UID: \"16d0750f-52c1-450d-8f18-d068333e7bc3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vpxmk" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.188272 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.190808 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:59 crc kubenswrapper[4869]: E0218 05:48:59.191032 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:15.190988139 +0000 UTC m=+52.360076391 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.209947 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.219050 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/311f547d-0650-427d-8a3d-31bfa0b56403-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zkqc9\" (UID: \"311f547d-0650-427d-8a3d-31bfa0b56403\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zkqc9" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.228249 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.236656 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/311f547d-0650-427d-8a3d-31bfa0b56403-config\") pod \"kube-apiserver-operator-766d6c64bb-zkqc9\" (UID: \"311f547d-0650-427d-8a3d-31bfa0b56403\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zkqc9" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.248156 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.268023 4869 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.287634 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.293125 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.293216 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.293278 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:59 crc kubenswrapper[4869]: E0218 05:48:59.293420 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 05:48:59 crc kubenswrapper[4869]: E0218 05:48:59.293443 4869 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 05:48:59 crc kubenswrapper[4869]: E0218 05:48:59.293467 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 05:48:59 crc kubenswrapper[4869]: E0218 05:48:59.293501 4869 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:59 crc kubenswrapper[4869]: E0218 05:48:59.293533 4869 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 05:48:59 crc kubenswrapper[4869]: E0218 05:48:59.293546 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 05:49:15.293514395 +0000 UTC m=+52.462602657 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 05:48:59 crc kubenswrapper[4869]: E0218 05:48:59.293593 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 05:49:15.293565586 +0000 UTC m=+52.462653858 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:59 crc kubenswrapper[4869]: E0218 05:48:59.293626 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 05:49:15.293612747 +0000 UTC m=+52.462701019 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.293457 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:59 crc kubenswrapper[4869]: E0218 05:48:59.293676 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 05:48:59 crc kubenswrapper[4869]: E0218 05:48:59.293693 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 05:48:59 crc kubenswrapper[4869]: E0218 05:48:59.293707 4869 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:59 crc kubenswrapper[4869]: E0218 05:48:59.293778 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 05:49:15.29373267 +0000 UTC m=+52.462820912 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.308208 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.361715 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.362371 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.370187 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.389484 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.408193 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.427834 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.448766 4869 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.467884 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.469930 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.470005 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.470035 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckzlt" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.470205 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.489089 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.498009 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3b34965-6f66-4f22-b655-05ae645a8f19-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2hvk6\" (UID: \"d3b34965-6f66-4f22-b655-05ae645a8f19\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2hvk6" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.508218 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.516620 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3b34965-6f66-4f22-b655-05ae645a8f19-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2hvk6\" (UID: \"d3b34965-6f66-4f22-b655-05ae645a8f19\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2hvk6" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.527925 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.548686 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.568478 4869 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-admission-controller-secret" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.580642 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/67eed62e-4836-47bb-96d0-cd6668d64a78-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-cnzqw\" (UID: \"67eed62e-4836-47bb-96d0-cd6668d64a78\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cnzqw" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.588566 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.609091 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.628235 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.648676 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.660103 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b635dd52-43a3-4285-a6f9-86b872db734a-serving-cert\") pod \"service-ca-operator-777779d784-zmmhg\" (UID: \"b635dd52-43a3-4285-a6f9-86b872db734a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zmmhg" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.667830 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.676044 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b635dd52-43a3-4285-a6f9-86b872db734a-config\") pod \"service-ca-operator-777779d784-zmmhg\" (UID: \"b635dd52-43a3-4285-a6f9-86b872db734a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zmmhg" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.686466 4869 request.go:700] Waited for 1.01714306s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca-operator/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.688784 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.709954 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.719381 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/79b505d9-7c7c-4ef4-a689-e2b80d855a09-profile-collector-cert\") pod \"catalog-operator-68c6474976-b5d72\" (UID: \"79b505d9-7c7c-4ef4-a689-e2b80d855a09\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5d72" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.720484 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf-secret-volume\") pod \"collect-profiles-29523225-j2h97\" (UID: \"c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-j2h97" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.731507 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 18 
05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.766126 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdgn7\" (UniqueName: \"kubernetes.io/projected/d72ef0c5-fb30-4d98-9237-a992acf49959-kube-api-access-qdgn7\") pod \"oauth-openshift-558db77b4-tthlh\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") " pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.789293 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.798346 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/be5b136e-e664-4ed5-9fbd-e2a9bdd06db9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-pmqtp\" (UID: \"be5b136e-e664-4ed5-9fbd-e2a9bdd06db9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pmqtp" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.799024 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccv8w\" (UniqueName: \"kubernetes.io/projected/8f7d294e-1e88-4265-8ad2-3344b7148caa-kube-api-access-ccv8w\") pod \"openshift-apiserver-operator-796bbdcf4f-c5v7h\" (UID: \"8f7d294e-1e88-4265-8ad2-3344b7148caa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c5v7h" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.809642 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.829667 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.848661 
4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.856728 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c5v7h" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.868987 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.881701 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a5c8a86f-e3a2-4088-9839-386b9dc56d03-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zd46x\" (UID: \"a5c8a86f-e3a2-4088-9839-386b9dc56d03\") " pod="openshift-marketplace/marketplace-operator-79b997595-zd46x" Feb 18 05:48:59 crc kubenswrapper[4869]: E0218 05:48:59.885300 4869 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Feb 18 05:48:59 crc kubenswrapper[4869]: E0218 05:48:59.885317 4869 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 18 05:48:59 crc kubenswrapper[4869]: E0218 05:48:59.885395 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf-config-volume podName:c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf nodeName:}" failed. No retries permitted until 2026-02-18 05:49:00.385369932 +0000 UTC m=+37.554458164 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf-config-volume") pod "collect-profiles-29523225-j2h97" (UID: "c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf") : failed to sync configmap cache: timed out waiting for the condition Feb 18 05:48:59 crc kubenswrapper[4869]: E0218 05:48:59.885447 4869 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Feb 18 05:48:59 crc kubenswrapper[4869]: E0218 05:48:59.885486 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79b505d9-7c7c-4ef4-a689-e2b80d855a09-srv-cert podName:79b505d9-7c7c-4ef4-a689-e2b80d855a09 nodeName:}" failed. No retries permitted until 2026-02-18 05:49:00.385440314 +0000 UTC m=+37.554528576 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/79b505d9-7c7c-4ef4-a689-e2b80d855a09-srv-cert") pod "catalog-operator-68c6474976-b5d72" (UID: "79b505d9-7c7c-4ef4-a689-e2b80d855a09") : failed to sync secret cache: timed out waiting for the condition Feb 18 05:48:59 crc kubenswrapper[4869]: E0218 05:48:59.885526 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84302a1e-f497-4332-8736-5e53c6741081-signing-key podName:84302a1e-f497-4332-8736-5e53c6741081 nodeName:}" failed. No retries permitted until 2026-02-18 05:49:00.385506035 +0000 UTC m=+37.554594497 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/84302a1e-f497-4332-8736-5e53c6741081-signing-key") pod "service-ca-9c57cc56f-tzbsp" (UID: "84302a1e-f497-4332-8736-5e53c6741081") : failed to sync secret cache: timed out waiting for the condition Feb 18 05:48:59 crc kubenswrapper[4869]: E0218 05:48:59.885550 4869 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Feb 18 05:48:59 crc kubenswrapper[4869]: E0218 05:48:59.885603 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a5c8a86f-e3a2-4088-9839-386b9dc56d03-marketplace-trusted-ca podName:a5c8a86f-e3a2-4088-9839-386b9dc56d03 nodeName:}" failed. No retries permitted until 2026-02-18 05:49:00.385586877 +0000 UTC m=+37.554675139 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/a5c8a86f-e3a2-4088-9839-386b9dc56d03-marketplace-trusted-ca") pod "marketplace-operator-79b997595-zd46x" (UID: "a5c8a86f-e3a2-4088-9839-386b9dc56d03") : failed to sync configmap cache: timed out waiting for the condition Feb 18 05:48:59 crc kubenswrapper[4869]: E0218 05:48:59.885801 4869 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Feb 18 05:48:59 crc kubenswrapper[4869]: E0218 05:48:59.885843 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/84302a1e-f497-4332-8736-5e53c6741081-signing-cabundle podName:84302a1e-f497-4332-8736-5e53c6741081 nodeName:}" failed. No retries permitted until 2026-02-18 05:49:00.385835153 +0000 UTC m=+37.554923385 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/84302a1e-f497-4332-8736-5e53c6741081-signing-cabundle") pod "service-ca-9c57cc56f-tzbsp" (UID: "84302a1e-f497-4332-8736-5e53c6741081") : failed to sync configmap cache: timed out waiting for the condition Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.889845 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.907639 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.944498 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.950109 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.968883 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.972828 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:48:59 crc kubenswrapper[4869]: I0218 05:48:59.988128 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.008817 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.028870 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.063724 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/41f2c352-c537-4c3b-b206-ef73eba45593-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wgv88\" (UID: \"41f2c352-c537-4c3b-b206-ef73eba45593\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wgv88" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.066625 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c5v7h"] Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.083426 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cknpx\" (UniqueName: \"kubernetes.io/projected/33549d64-6034-4f60-b254-2729e899e541-kube-api-access-cknpx\") pod \"apiserver-76f77b778f-bft8b\" (UID: \"33549d64-6034-4f60-b254-2729e899e541\") " pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.106444 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dgzp\" (UniqueName: \"kubernetes.io/projected/2e1a4023-11b5-47b3-886a-c37ecabc0103-kube-api-access-8dgzp\") pod \"machine-approver-56656f9798-bdj4n\" (UID: 
\"2e1a4023-11b5-47b3-886a-c37ecabc0103\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bdj4n" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.123606 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27fsf\" (UniqueName: \"kubernetes.io/projected/b8d17264-7c40-4eb6-82eb-f4020e635dde-kube-api-access-27fsf\") pod \"apiserver-7bbb656c7d-stbsw\" (UID: \"b8d17264-7c40-4eb6-82eb-f4020e635dde\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.126243 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.144211 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzk27\" (UniqueName: \"kubernetes.io/projected/f7ad651f-69ed-4f4f-9792-6f6a1e635edc-kube-api-access-fzk27\") pod \"authentication-operator-69f744f599-2gwhz\" (UID: \"f7ad651f-69ed-4f4f-9792-6f6a1e635edc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2gwhz" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.155041 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tthlh"] Feb 18 05:49:00 crc kubenswrapper[4869]: W0218 05:49:00.163072 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd72ef0c5_fb30_4d98_9237_a992acf49959.slice/crio-381edd3bf077c5e6352d06f632c3e6af54b1281337157000ed4a2d5506f4a5ad WatchSource:0}: Error finding container 381edd3bf077c5e6352d06f632c3e6af54b1281337157000ed4a2d5506f4a5ad: Status 404 returned error can't find the container with id 381edd3bf077c5e6352d06f632c3e6af54b1281337157000ed4a2d5506f4a5ad Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.163500 4869 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-dqdv5\" (UniqueName: \"kubernetes.io/projected/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-kube-api-access-dqdv5\") pod \"console-f9d7485db-tc859\" (UID: \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " pod="openshift-console/console-f9d7485db-tc859" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.184046 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-2gwhz" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.184735 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tn57\" (UniqueName: \"kubernetes.io/projected/f43ce73f-2618-4c47-9304-7db1e67db22f-kube-api-access-7tn57\") pod \"openshift-config-operator-7777fb866f-fbl5d\" (UID: \"f43ce73f-2618-4c47-9304-7db1e67db22f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fbl5d" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.211208 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7d7w\" (UniqueName: \"kubernetes.io/projected/41f2c352-c537-4c3b-b206-ef73eba45593-kube-api-access-d7d7w\") pod \"cluster-image-registry-operator-dc59b4c8b-wgv88\" (UID: \"41f2c352-c537-4c3b-b206-ef73eba45593\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wgv88" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.223577 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtnpg\" (UniqueName: \"kubernetes.io/projected/d13ea6d8-477e-4add-b9dc-f8cac9eb0b01-kube-api-access-gtnpg\") pod \"downloads-7954f5f757-mx9zm\" (UID: \"d13ea6d8-477e-4add-b9dc-f8cac9eb0b01\") " pod="openshift-console/downloads-7954f5f757-mx9zm" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.245177 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbkch\" (UniqueName: 
\"kubernetes.io/projected/c53da5b2-9e42-4160-8ed1-2600d9e76880-kube-api-access-wbkch\") pod \"route-controller-manager-6576b87f9c-b7522\" (UID: \"c53da5b2-9e42-4160-8ed1-2600d9e76880\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.263710 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29h4t\" (UniqueName: \"kubernetes.io/projected/7411fb2b-cd62-452d-a5c8-94135752329d-kube-api-access-29h4t\") pod \"machine-api-operator-5694c8668f-lljlj\" (UID: \"7411fb2b-cd62-452d-a5c8-94135752329d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lljlj" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.265492 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wgv88" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.280994 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bdj4n" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.285782 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w5hr\" (UniqueName: \"kubernetes.io/projected/5a9ba96f-26e6-4870-9e59-9735f210eef3-kube-api-access-8w5hr\") pod \"controller-manager-879f6c89f-9m6gl\" (UID: \"5a9ba96f-26e6-4870-9e59-9735f210eef3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.288347 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.290033 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bft8b"] Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.290510 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-tc859" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.302058 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-mx9zm" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.305045 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lljlj" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.305446 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.308794 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 18 05:49:00 crc kubenswrapper[4869]: W0218 05:49:00.323569 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e1a4023_11b5_47b3_886a_c37ecabc0103.slice/crio-921d3f193ff7cff364b35d8ae7d6b785a01df2d67fb30023bcbf73bd8d2d86e0 WatchSource:0}: Error finding container 921d3f193ff7cff364b35d8ae7d6b785a01df2d67fb30023bcbf73bd8d2d86e0: Status 404 returned error can't find the container with id 921d3f193ff7cff364b35d8ae7d6b785a01df2d67fb30023bcbf73bd8d2d86e0 Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.328405 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.332792 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.349408 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.370357 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.371208 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-2gwhz"] Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.418635 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/79b505d9-7c7c-4ef4-a689-e2b80d855a09-srv-cert\") pod \"catalog-operator-68c6474976-b5d72\" (UID: \"79b505d9-7c7c-4ef4-a689-e2b80d855a09\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5d72" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.418764 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf-config-volume\") pod \"collect-profiles-29523225-j2h97\" (UID: \"c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-j2h97" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.418811 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5c8a86f-e3a2-4088-9839-386b9dc56d03-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zd46x\" (UID: \"a5c8a86f-e3a2-4088-9839-386b9dc56d03\") " pod="openshift-marketplace/marketplace-operator-79b997595-zd46x" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.421533 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/84302a1e-f497-4332-8736-5e53c6741081-signing-key\") pod \"service-ca-9c57cc56f-tzbsp\" (UID: \"84302a1e-f497-4332-8736-5e53c6741081\") " pod="openshift-service-ca/service-ca-9c57cc56f-tzbsp" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.421772 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/84302a1e-f497-4332-8736-5e53c6741081-signing-cabundle\") pod \"service-ca-9c57cc56f-tzbsp\" (UID: \"84302a1e-f497-4332-8736-5e53c6741081\") " pod="openshift-service-ca/service-ca-9c57cc56f-tzbsp" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.422111 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.422382 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.422692 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5c8a86f-e3a2-4088-9839-386b9dc56d03-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zd46x\" (UID: \"a5c8a86f-e3a2-4088-9839-386b9dc56d03\") " pod="openshift-marketplace/marketplace-operator-79b997595-zd46x" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.423184 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf-config-volume\") pod \"collect-profiles-29523225-j2h97\" (UID: \"c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-j2h97" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.423607 4869 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.434442 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.449075 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.455493 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/84302a1e-f497-4332-8736-5e53c6741081-signing-cabundle\") pod \"service-ca-9c57cc56f-tzbsp\" (UID: \"84302a1e-f497-4332-8736-5e53c6741081\") " pod="openshift-service-ca/service-ca-9c57cc56f-tzbsp" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.468226 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fbl5d" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.472018 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.480549 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/84302a1e-f497-4332-8736-5e53c6741081-signing-key\") pod \"service-ca-9c57cc56f-tzbsp\" (UID: \"84302a1e-f497-4332-8736-5e53c6741081\") " pod="openshift-service-ca/service-ca-9c57cc56f-tzbsp" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.491049 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.508968 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 18 
05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.533575 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.551003 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wgv88"] Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.555589 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/79b505d9-7c7c-4ef4-a689-e2b80d855a09-srv-cert\") pod \"catalog-operator-68c6474976-b5d72\" (UID: \"79b505d9-7c7c-4ef4-a689-e2b80d855a09\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5d72" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.569003 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.591057 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.602076 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mx9zm"] Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.609025 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.631911 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 18 05:49:00 crc kubenswrapper[4869]: W0218 05:49:00.637478 4869 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41f2c352_c537_4c3b_b206_ef73eba45593.slice/crio-5597acf5d56890a0f861cf16678b9c25b030c0f8111093194de56ffea4b20972 WatchSource:0}: Error finding container 5597acf5d56890a0f861cf16678b9c25b030c0f8111093194de56ffea4b20972: Status 404 returned error can't find the container with id 5597acf5d56890a0f861cf16678b9c25b030c0f8111093194de56ffea4b20972 Feb 18 05:49:00 crc kubenswrapper[4869]: W0218 05:49:00.639544 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd13ea6d8_477e_4add_b9dc_f8cac9eb0b01.slice/crio-3c7946e7d7d78e1ae7722427b79130f416d7d07fad5b8d7f18d4aadfd71cf1d7 WatchSource:0}: Error finding container 3c7946e7d7d78e1ae7722427b79130f416d7d07fad5b8d7f18d4aadfd71cf1d7: Status 404 returned error can't find the container with id 3c7946e7d7d78e1ae7722427b79130f416d7d07fad5b8d7f18d4aadfd71cf1d7 Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.649060 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.669115 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.688524 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.698434 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lljlj"] Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.707010 4869 request.go:700] Waited for 1.935462573s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-dockercfg-jwfmh&limit=500&resourceVersion=0 Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.709434 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 18 05:49:00 crc kubenswrapper[4869]: W0218 05:49:00.719929 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7411fb2b_cd62_452d_a5c8_94135752329d.slice/crio-ea662b92f9da5648dc3a9e875f17be4b5a0fe585d44182f8a8362e68ba309f95 WatchSource:0}: Error finding container ea662b92f9da5648dc3a9e875f17be4b5a0fe585d44182f8a8362e68ba309f95: Status 404 returned error can't find the container with id ea662b92f9da5648dc3a9e875f17be4b5a0fe585d44182f8a8362e68ba309f95 Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.728534 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.746781 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wgv88" event={"ID":"41f2c352-c537-4c3b-b206-ef73eba45593","Type":"ContainerStarted","Data":"5597acf5d56890a0f861cf16678b9c25b030c0f8111093194de56ffea4b20972"} Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.748761 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lljlj" event={"ID":"7411fb2b-cd62-452d-a5c8-94135752329d","Type":"ContainerStarted","Data":"ea662b92f9da5648dc3a9e875f17be4b5a0fe585d44182f8a8362e68ba309f95"} Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.749119 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.749849 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-76f77b778f-bft8b" event={"ID":"33549d64-6034-4f60-b254-2729e899e541","Type":"ContainerStarted","Data":"d74fe7c37cc0d2c1ab6deffcc5a5850a3f42c236b5f0b5e9c5b5686a567ce30a"} Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.751170 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" event={"ID":"d72ef0c5-fb30-4d98-9237-a992acf49959","Type":"ContainerStarted","Data":"d2aa7a5a5f943b63ee723f45044fe5ed6ee0976a70e3ee41341a0a72f512fae6"} Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.751200 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" event={"ID":"d72ef0c5-fb30-4d98-9237-a992acf49959","Type":"ContainerStarted","Data":"381edd3bf077c5e6352d06f632c3e6af54b1281337157000ed4a2d5506f4a5ad"} Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.751859 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.753088 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mx9zm" event={"ID":"d13ea6d8-477e-4add-b9dc-f8cac9eb0b01","Type":"ContainerStarted","Data":"3c7946e7d7d78e1ae7722427b79130f416d7d07fad5b8d7f18d4aadfd71cf1d7"} Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.755089 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-2gwhz" event={"ID":"f7ad651f-69ed-4f4f-9792-6f6a1e635edc","Type":"ContainerStarted","Data":"fe4e121a1010dd72afd2a3c7bb07002a076339bdd0e16b87f69868d84c905e3f"} Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.755130 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-2gwhz" 
event={"ID":"f7ad651f-69ed-4f4f-9792-6f6a1e635edc","Type":"ContainerStarted","Data":"cb28b679b16937d5eb22fc414aea2287c4530e6d1191d76248bd82a33f8ea1a6"} Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.763488 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c5v7h" event={"ID":"8f7d294e-1e88-4265-8ad2-3344b7148caa","Type":"ContainerStarted","Data":"27d9755aab233a4e5e0ef5663f4dd359a893b44c5e53a9d1cd132afd2a6e2841"} Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.763540 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c5v7h" event={"ID":"8f7d294e-1e88-4265-8ad2-3344b7148caa","Type":"ContainerStarted","Data":"57763cf2bca7fa3213a2b9d9af6fdd1d507919c291519df1285627feccf76c8d"} Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.766068 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bdj4n" event={"ID":"2e1a4023-11b5-47b3-886a-c37ecabc0103","Type":"ContainerStarted","Data":"921d3f193ff7cff364b35d8ae7d6b785a01df2d67fb30023bcbf73bd8d2d86e0"} Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.766950 4869 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-tthlh container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.22:6443/healthz\": dial tcp 10.217.0.22:6443: connect: connection refused" start-of-body= Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.766995 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" podUID="d72ef0c5-fb30-4d98-9237-a992acf49959" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.22:6443/healthz\": dial tcp 10.217.0.22:6443: connect: connection refused" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 
05:49:00.790905 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tz77\" (UniqueName: \"kubernetes.io/projected/81e0ea07-506f-4560-90de-b5ae7675113f-kube-api-access-8tz77\") pod \"etcd-operator-b45778765-fhpv8\" (UID: \"81e0ea07-506f-4560-90de-b5ae7675113f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fhpv8" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.813243 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-tc859"] Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.820247 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njqgw\" (UniqueName: \"kubernetes.io/projected/f96a9f3e-b511-4f78-83fd-a134aa3d5106-kube-api-access-njqgw\") pod \"cluster-samples-operator-665b6dd947-p8b4x\" (UID: \"f96a9f3e-b511-4f78-83fd-a134aa3d5106\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p8b4x" Feb 18 05:49:00 crc kubenswrapper[4869]: W0218 05:49:00.822117 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97572f7e_7ea7_4b5a_b1d4_cd132f0c34a2.slice/crio-2c90b0b6a107b8b2849e81696509f4d01f642eeedd2c6cec1152eabb918f5b40 WatchSource:0}: Error finding container 2c90b0b6a107b8b2849e81696509f4d01f642eeedd2c6cec1152eabb918f5b40: Status 404 returned error can't find the container with id 2c90b0b6a107b8b2849e81696509f4d01f642eeedd2c6cec1152eabb918f5b40 Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.830977 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5gb6\" (UniqueName: \"kubernetes.io/projected/71f66291-128a-4347-9c2b-e8a1f25aea67-kube-api-access-c5gb6\") pod \"openshift-controller-manager-operator-756b6f6bc6-wzw2f\" (UID: \"71f66291-128a-4347-9c2b-e8a1f25aea67\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzw2f" Feb 18 
05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.844447 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c2a55f3d-af8c-4589-a9b0-22b324a77524-bound-sa-token\") pod \"ingress-operator-5b745b69d9-twbsv\" (UID: \"c2a55f3d-af8c-4589-a9b0-22b324a77524\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-twbsv" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.869788 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tvln\" (UniqueName: \"kubernetes.io/projected/f39f9dc9-3127-4302-93dd-faf8bb814b58-kube-api-access-8tvln\") pod \"console-operator-58897d9998-mqfwk\" (UID: \"f39f9dc9-3127-4302-93dd-faf8bb814b58\") " pod="openshift-console-operator/console-operator-58897d9998-mqfwk" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.882733 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dzhd\" (UniqueName: \"kubernetes.io/projected/c2a55f3d-af8c-4589-a9b0-22b324a77524-kube-api-access-4dzhd\") pod \"ingress-operator-5b745b69d9-twbsv\" (UID: \"c2a55f3d-af8c-4589-a9b0-22b324a77524\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-twbsv" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.909550 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.912635 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l85k\" (UniqueName: \"kubernetes.io/projected/61805635-7d39-4980-b1be-18cf8f05074d-kube-api-access-2l85k\") pod \"router-default-5444994796-pg7gk\" (UID: \"61805635-7d39-4980-b1be-18cf8f05074d\") " pod="openshift-ingress/router-default-5444994796-pg7gk" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.912933 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p8b4x" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.920866 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzw2f" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.924807 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-mqfwk" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.926543 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522"] Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.929884 4869 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.932077 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-twbsv" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.938896 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-pg7gk" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.942327 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw"] Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.944633 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fhpv8" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.949453 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 18 05:49:00 crc kubenswrapper[4869]: I0218 05:49:00.978844 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.008633 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wt67\" (UniqueName: \"kubernetes.io/projected/67eed62e-4836-47bb-96d0-cd6668d64a78-kube-api-access-5wt67\") pod \"multus-admission-controller-857f4d67dd-cnzqw\" (UID: \"67eed62e-4836-47bb-96d0-cd6668d64a78\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cnzqw" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.019464 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9m6gl"] Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.025840 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnpfw\" (UniqueName: \"kubernetes.io/projected/c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf-kube-api-access-mnpfw\") pod \"collect-profiles-29523225-j2h97\" (UID: \"c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-j2h97" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.055109 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16d0750f-52c1-450d-8f18-d068333e7bc3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vpxmk\" (UID: \"16d0750f-52c1-450d-8f18-d068333e7bc3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vpxmk" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 
05:49:01.062132 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fbl5d"] Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.065190 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-j2h97" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.066436 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dsp2\" (UniqueName: \"kubernetes.io/projected/79b505d9-7c7c-4ef4-a689-e2b80d855a09-kube-api-access-7dsp2\") pod \"catalog-operator-68c6474976-b5d72\" (UID: \"79b505d9-7c7c-4ef4-a689-e2b80d855a09\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5d72" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.084978 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffbxr\" (UniqueName: \"kubernetes.io/projected/d3b34965-6f66-4f22-b655-05ae645a8f19-kube-api-access-ffbxr\") pod \"kube-storage-version-migrator-operator-b67b599dd-2hvk6\" (UID: \"d3b34965-6f66-4f22-b655-05ae645a8f19\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2hvk6" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.086480 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5d72" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.106933 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9ssv\" (UniqueName: \"kubernetes.io/projected/84302a1e-f497-4332-8736-5e53c6741081-kube-api-access-w9ssv\") pod \"service-ca-9c57cc56f-tzbsp\" (UID: \"84302a1e-f497-4332-8736-5e53c6741081\") " pod="openshift-service-ca/service-ca-9c57cc56f-tzbsp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.128919 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tjpw\" (UniqueName: \"kubernetes.io/projected/a5c8a86f-e3a2-4088-9839-386b9dc56d03-kube-api-access-4tjpw\") pod \"marketplace-operator-79b997595-zd46x\" (UID: \"a5c8a86f-e3a2-4088-9839-386b9dc56d03\") " pod="openshift-marketplace/marketplace-operator-79b997595-zd46x" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.152446 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq47d\" (UniqueName: \"kubernetes.io/projected/b635dd52-43a3-4285-a6f9-86b872db734a-kube-api-access-lq47d\") pod \"service-ca-operator-777779d784-zmmhg\" (UID: \"b635dd52-43a3-4285-a6f9-86b872db734a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zmmhg" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.168838 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x5jz\" (UniqueName: \"kubernetes.io/projected/be5b136e-e664-4ed5-9fbd-e2a9bdd06db9-kube-api-access-5x5jz\") pod \"control-plane-machine-set-operator-78cbb6b69f-pmqtp\" (UID: \"be5b136e-e664-4ed5-9fbd-e2a9bdd06db9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pmqtp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.187129 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/311f547d-0650-427d-8a3d-31bfa0b56403-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zkqc9\" (UID: \"311f547d-0650-427d-8a3d-31bfa0b56403\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zkqc9" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.209367 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.231195 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.244934 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzw2f"] Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.253702 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.255242 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vpxmk" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.261908 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zkqc9" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.271808 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.287997 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2hvk6" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.290381 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.309928 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-cnzqw" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.311400 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zmmhg" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.312849 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.326117 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pmqtp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.333372 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zd46x" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.337672 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/65822bd2-50a3-41f3-b50d-ec86090ce4d4-webhook-cert\") pod \"packageserver-d55dfcdfc-7qwwp\" (UID: \"65822bd2-50a3-41f3-b50d-ec86090ce4d4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7qwwp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.337761 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5lmq\" (UniqueName: \"kubernetes.io/projected/464a7f33-924c-4f63-8c70-b9fac3a3a4c2-kube-api-access-m5lmq\") pod \"package-server-manager-789f6589d5-hj6gq\" (UID: \"464a7f33-924c-4f63-8c70-b9fac3a3a4c2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hj6gq" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.337785 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/98621832-7945-48e2-9aa9-f3fa87801887-images\") pod \"machine-config-operator-74547568cd-lpmxp\" (UID: \"98621832-7945-48e2-9aa9-f3fa87801887\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lpmxp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.339856 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7bd4427e-4327-477c-a527-de6c4bf89088-registry-certificates\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.339885 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/65822bd2-50a3-41f3-b50d-ec86090ce4d4-tmpfs\") pod \"packageserver-d55dfcdfc-7qwwp\" (UID: \"65822bd2-50a3-41f3-b50d-ec86090ce4d4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7qwwp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.339926 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptp5p\" (UniqueName: \"kubernetes.io/projected/65822bd2-50a3-41f3-b50d-ec86090ce4d4-kube-api-access-ptp5p\") pod \"packageserver-d55dfcdfc-7qwwp\" (UID: \"65822bd2-50a3-41f3-b50d-ec86090ce4d4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7qwwp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.339953 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.340269 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/65822bd2-50a3-41f3-b50d-ec86090ce4d4-apiservice-cert\") pod \"packageserver-d55dfcdfc-7qwwp\" (UID: \"65822bd2-50a3-41f3-b50d-ec86090ce4d4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7qwwp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.340319 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s2lw\" (UniqueName: \"kubernetes.io/projected/98621832-7945-48e2-9aa9-f3fa87801887-kube-api-access-7s2lw\") pod 
\"machine-config-operator-74547568cd-lpmxp\" (UID: \"98621832-7945-48e2-9aa9-f3fa87801887\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lpmxp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.340339 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7bd4427e-4327-477c-a527-de6c4bf89088-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.340386 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qr2w\" (UniqueName: \"kubernetes.io/projected/e7c62b6d-4e36-487e-908a-da0e5748a9bb-kube-api-access-6qr2w\") pod \"olm-operator-6b444d44fb-85kpf\" (UID: \"e7c62b6d-4e36-487e-908a-da0e5748a9bb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85kpf" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.340468 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04d75c03-610d-490c-a8f3-f98ca377ccda-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-k2hlr\" (UID: \"04d75c03-610d-490c-a8f3-f98ca377ccda\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2hlr" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.340491 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sncqj\" (UniqueName: \"kubernetes.io/projected/7bd4427e-4327-477c-a527-de6c4bf89088-kube-api-access-sncqj\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 
05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.340510 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6f562a0-0eb8-422c-9753-b637b745e579-metrics-tls\") pod \"dns-operator-744455d44c-jc49r\" (UID: \"f6f562a0-0eb8-422c-9753-b637b745e579\") " pod="openshift-dns-operator/dns-operator-744455d44c-jc49r" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.340535 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqnhm\" (UniqueName: \"kubernetes.io/projected/ad2d8057-b6d0-48fe-b95d-3126a4631754-kube-api-access-zqnhm\") pod \"migrator-59844c95c7-2mz86\" (UID: \"ad2d8057-b6d0-48fe-b95d-3126a4631754\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2mz86" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.340569 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98621832-7945-48e2-9aa9-f3fa87801887-proxy-tls\") pod \"machine-config-operator-74547568cd-lpmxp\" (UID: \"98621832-7945-48e2-9aa9-f3fa87801887\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lpmxp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.340595 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nvz5\" (UniqueName: \"kubernetes.io/projected/f6f562a0-0eb8-422c-9753-b637b745e579-kube-api-access-5nvz5\") pod \"dns-operator-744455d44c-jc49r\" (UID: \"f6f562a0-0eb8-422c-9753-b637b745e579\") " pod="openshift-dns-operator/dns-operator-744455d44c-jc49r" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.340614 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g57bq\" (UniqueName: 
\"kubernetes.io/projected/ca46101e-e80b-47e2-bd0e-086ca89be1b1-kube-api-access-g57bq\") pod \"machine-config-controller-84d6567774-8qkfs\" (UID: \"ca46101e-e80b-47e2-bd0e-086ca89be1b1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qkfs" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.340678 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7bd4427e-4327-477c-a527-de6c4bf89088-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.340704 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ca46101e-e80b-47e2-bd0e-086ca89be1b1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8qkfs\" (UID: \"ca46101e-e80b-47e2-bd0e-086ca89be1b1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qkfs" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.340722 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/464a7f33-924c-4f63-8c70-b9fac3a3a4c2-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-hj6gq\" (UID: \"464a7f33-924c-4f63-8c70-b9fac3a3a4c2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hj6gq" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.340757 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e7c62b6d-4e36-487e-908a-da0e5748a9bb-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-85kpf\" (UID: \"e7c62b6d-4e36-487e-908a-da0e5748a9bb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85kpf" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.340790 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca46101e-e80b-47e2-bd0e-086ca89be1b1-proxy-tls\") pod \"machine-config-controller-84d6567774-8qkfs\" (UID: \"ca46101e-e80b-47e2-bd0e-086ca89be1b1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qkfs" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.340923 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7bd4427e-4327-477c-a527-de6c4bf89088-registry-tls\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.340965 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/98621832-7945-48e2-9aa9-f3fa87801887-auth-proxy-config\") pod \"machine-config-operator-74547568cd-lpmxp\" (UID: \"98621832-7945-48e2-9aa9-f3fa87801887\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lpmxp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.340982 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e7c62b6d-4e36-487e-908a-da0e5748a9bb-srv-cert\") pod \"olm-operator-6b444d44fb-85kpf\" (UID: \"e7c62b6d-4e36-487e-908a-da0e5748a9bb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85kpf" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.341002 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7bd4427e-4327-477c-a527-de6c4bf89088-bound-sa-token\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.341030 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04d75c03-610d-490c-a8f3-f98ca377ccda-config\") pod \"kube-controller-manager-operator-78b949d7b-k2hlr\" (UID: \"04d75c03-610d-490c-a8f3-f98ca377ccda\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2hlr" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.341046 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7bd4427e-4327-477c-a527-de6c4bf89088-trusted-ca\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.341061 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04d75c03-610d-490c-a8f3-f98ca377ccda-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-k2hlr\" (UID: \"04d75c03-610d-490c-a8f3-f98ca377ccda\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2hlr" Feb 18 05:49:01 crc kubenswrapper[4869]: E0218 05:49:01.342176 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-18 05:49:01.842155463 +0000 UTC m=+39.011243885 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.375909 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p8b4x"] Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.378251 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-tzbsp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.442701 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:01 crc kubenswrapper[4869]: E0218 05:49:01.443252 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:01.943222094 +0000 UTC m=+39.112310316 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.444570 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nvz5\" (UniqueName: \"kubernetes.io/projected/f6f562a0-0eb8-422c-9753-b637b745e579-kube-api-access-5nvz5\") pod \"dns-operator-744455d44c-jc49r\" (UID: \"f6f562a0-0eb8-422c-9753-b637b745e579\") " pod="openshift-dns-operator/dns-operator-744455d44c-jc49r" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.444636 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g57bq\" (UniqueName: \"kubernetes.io/projected/ca46101e-e80b-47e2-bd0e-086ca89be1b1-kube-api-access-g57bq\") pod \"machine-config-controller-84d6567774-8qkfs\" (UID: \"ca46101e-e80b-47e2-bd0e-086ca89be1b1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qkfs" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.444671 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/873e5900-9845-4229-9d30-8b59c34f86fc-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-zc447\" (UID: \"873e5900-9845-4229-9d30-8b59c34f86fc\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zc447" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.444729 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/9742d031-8f05-438c-8028-700eb13042fe-socket-dir\") pod \"csi-hostpathplugin-59xq2\" (UID: \"9742d031-8f05-438c-8028-700eb13042fe\") " pod="hostpath-provisioner/csi-hostpathplugin-59xq2" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.444763 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7bd4427e-4327-477c-a527-de6c4bf89088-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.444795 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ca46101e-e80b-47e2-bd0e-086ca89be1b1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8qkfs\" (UID: \"ca46101e-e80b-47e2-bd0e-086ca89be1b1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qkfs" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.445331 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/464a7f33-924c-4f63-8c70-b9fac3a3a4c2-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-hj6gq\" (UID: \"464a7f33-924c-4f63-8c70-b9fac3a3a4c2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hj6gq" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.445392 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9742d031-8f05-438c-8028-700eb13042fe-csi-data-dir\") pod \"csi-hostpathplugin-59xq2\" (UID: \"9742d031-8f05-438c-8028-700eb13042fe\") " pod="hostpath-provisioner/csi-hostpathplugin-59xq2" Feb 18 05:49:01 crc 
kubenswrapper[4869]: I0218 05:49:01.445469 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e7c62b6d-4e36-487e-908a-da0e5748a9bb-profile-collector-cert\") pod \"olm-operator-6b444d44fb-85kpf\" (UID: \"e7c62b6d-4e36-487e-908a-da0e5748a9bb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85kpf" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.445511 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca46101e-e80b-47e2-bd0e-086ca89be1b1-proxy-tls\") pod \"machine-config-controller-84d6567774-8qkfs\" (UID: \"ca46101e-e80b-47e2-bd0e-086ca89be1b1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qkfs" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.445551 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ec275d88-805e-40d9-8c06-0c86712f739c-node-bootstrap-token\") pod \"machine-config-server-8xm9f\" (UID: \"ec275d88-805e-40d9-8c06-0c86712f739c\") " pod="openshift-machine-config-operator/machine-config-server-8xm9f" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.445573 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/873e5900-9845-4229-9d30-8b59c34f86fc-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-zc447\" (UID: \"873e5900-9845-4229-9d30-8b59c34f86fc\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zc447" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.445626 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a6cedf4-7ab8-4763-8ccf-78c6a074b9c0-cert\") pod \"ingress-canary-t9nfk\" (UID: 
\"1a6cedf4-7ab8-4763-8ccf-78c6a074b9c0\") " pod="openshift-ingress-canary/ingress-canary-t9nfk" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.445672 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7bd4427e-4327-477c-a527-de6c4bf89088-registry-tls\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.445706 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/98621832-7945-48e2-9aa9-f3fa87801887-auth-proxy-config\") pod \"machine-config-operator-74547568cd-lpmxp\" (UID: \"98621832-7945-48e2-9aa9-f3fa87801887\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lpmxp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.445723 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e7c62b6d-4e36-487e-908a-da0e5748a9bb-srv-cert\") pod \"olm-operator-6b444d44fb-85kpf\" (UID: \"e7c62b6d-4e36-487e-908a-da0e5748a9bb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85kpf" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.445809 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7bd4427e-4327-477c-a527-de6c4bf89088-bound-sa-token\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.445850 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04d75c03-610d-490c-a8f3-f98ca377ccda-config\") 
pod \"kube-controller-manager-operator-78b949d7b-k2hlr\" (UID: \"04d75c03-610d-490c-a8f3-f98ca377ccda\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2hlr" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.445870 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9742d031-8f05-438c-8028-700eb13042fe-plugins-dir\") pod \"csi-hostpathplugin-59xq2\" (UID: \"9742d031-8f05-438c-8028-700eb13042fe\") " pod="hostpath-provisioner/csi-hostpathplugin-59xq2" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.445886 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bms9\" (UniqueName: \"kubernetes.io/projected/ec275d88-805e-40d9-8c06-0c86712f739c-kube-api-access-2bms9\") pod \"machine-config-server-8xm9f\" (UID: \"ec275d88-805e-40d9-8c06-0c86712f739c\") " pod="openshift-machine-config-operator/machine-config-server-8xm9f" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.445938 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7bd4427e-4327-477c-a527-de6c4bf89088-trusted-ca\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.445972 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04d75c03-610d-490c-a8f3-f98ca377ccda-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-k2hlr\" (UID: \"04d75c03-610d-490c-a8f3-f98ca377ccda\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2hlr" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.446000 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/65822bd2-50a3-41f3-b50d-ec86090ce4d4-webhook-cert\") pod \"packageserver-d55dfcdfc-7qwwp\" (UID: \"65822bd2-50a3-41f3-b50d-ec86090ce4d4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7qwwp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.446028 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/98621832-7945-48e2-9aa9-f3fa87801887-images\") pod \"machine-config-operator-74547568cd-lpmxp\" (UID: \"98621832-7945-48e2-9aa9-f3fa87801887\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lpmxp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.446046 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5lmq\" (UniqueName: \"kubernetes.io/projected/464a7f33-924c-4f63-8c70-b9fac3a3a4c2-kube-api-access-m5lmq\") pod \"package-server-manager-789f6589d5-hj6gq\" (UID: \"464a7f33-924c-4f63-8c70-b9fac3a3a4c2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hj6gq" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.447179 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d2a84e5-581e-4dc3-93fe-b515f3b0f0e2-metrics-tls\") pod \"dns-default-h6w89\" (UID: \"6d2a84e5-581e-4dc3-93fe-b515f3b0f0e2\") " pod="openshift-dns/dns-default-h6w89" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.447245 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9742d031-8f05-438c-8028-700eb13042fe-registration-dir\") pod \"csi-hostpathplugin-59xq2\" (UID: \"9742d031-8f05-438c-8028-700eb13042fe\") " 
pod="hostpath-provisioner/csi-hostpathplugin-59xq2" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.447274 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/873e5900-9845-4229-9d30-8b59c34f86fc-ready\") pod \"cni-sysctl-allowlist-ds-zc447\" (UID: \"873e5900-9845-4229-9d30-8b59c34f86fc\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zc447" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.447303 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9742d031-8f05-438c-8028-700eb13042fe-mountpoint-dir\") pod \"csi-hostpathplugin-59xq2\" (UID: \"9742d031-8f05-438c-8028-700eb13042fe\") " pod="hostpath-provisioner/csi-hostpathplugin-59xq2" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.447324 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7bd4427e-4327-477c-a527-de6c4bf89088-registry-certificates\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.447548 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/65822bd2-50a3-41f3-b50d-ec86090ce4d4-tmpfs\") pod \"packageserver-d55dfcdfc-7qwwp\" (UID: \"65822bd2-50a3-41f3-b50d-ec86090ce4d4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7qwwp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.448734 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d2a84e5-581e-4dc3-93fe-b515f3b0f0e2-config-volume\") pod \"dns-default-h6w89\" (UID: 
\"6d2a84e5-581e-4dc3-93fe-b515f3b0f0e2\") " pod="openshift-dns/dns-default-h6w89" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.448783 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptp5p\" (UniqueName: \"kubernetes.io/projected/65822bd2-50a3-41f3-b50d-ec86090ce4d4-kube-api-access-ptp5p\") pod \"packageserver-d55dfcdfc-7qwwp\" (UID: \"65822bd2-50a3-41f3-b50d-ec86090ce4d4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7qwwp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.448803 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ndr8\" (UniqueName: \"kubernetes.io/projected/873e5900-9845-4229-9d30-8b59c34f86fc-kube-api-access-6ndr8\") pod \"cni-sysctl-allowlist-ds-zc447\" (UID: \"873e5900-9845-4229-9d30-8b59c34f86fc\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zc447" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.448825 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ec275d88-805e-40d9-8c06-0c86712f739c-certs\") pod \"machine-config-server-8xm9f\" (UID: \"ec275d88-805e-40d9-8c06-0c86712f739c\") " pod="openshift-machine-config-operator/machine-config-server-8xm9f" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.448858 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnkhd\" (UniqueName: \"kubernetes.io/projected/6d2a84e5-581e-4dc3-93fe-b515f3b0f0e2-kube-api-access-dnkhd\") pod \"dns-default-h6w89\" (UID: \"6d2a84e5-581e-4dc3-93fe-b515f3b0f0e2\") " pod="openshift-dns/dns-default-h6w89" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.448887 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.448905 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/65822bd2-50a3-41f3-b50d-ec86090ce4d4-apiservice-cert\") pod \"packageserver-d55dfcdfc-7qwwp\" (UID: \"65822bd2-50a3-41f3-b50d-ec86090ce4d4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7qwwp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.448992 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s2lw\" (UniqueName: \"kubernetes.io/projected/98621832-7945-48e2-9aa9-f3fa87801887-kube-api-access-7s2lw\") pod \"machine-config-operator-74547568cd-lpmxp\" (UID: \"98621832-7945-48e2-9aa9-f3fa87801887\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lpmxp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.449020 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7bd4427e-4327-477c-a527-de6c4bf89088-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.449037 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgvr6\" (UniqueName: \"kubernetes.io/projected/1a6cedf4-7ab8-4763-8ccf-78c6a074b9c0-kube-api-access-fgvr6\") pod \"ingress-canary-t9nfk\" (UID: \"1a6cedf4-7ab8-4763-8ccf-78c6a074b9c0\") " pod="openshift-ingress-canary/ingress-canary-t9nfk" Feb 18 05:49:01 crc kubenswrapper[4869]: 
I0218 05:49:01.449070 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psg42\" (UniqueName: \"kubernetes.io/projected/9742d031-8f05-438c-8028-700eb13042fe-kube-api-access-psg42\") pod \"csi-hostpathplugin-59xq2\" (UID: \"9742d031-8f05-438c-8028-700eb13042fe\") " pod="hostpath-provisioner/csi-hostpathplugin-59xq2" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.450300 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qr2w\" (UniqueName: \"kubernetes.io/projected/e7c62b6d-4e36-487e-908a-da0e5748a9bb-kube-api-access-6qr2w\") pod \"olm-operator-6b444d44fb-85kpf\" (UID: \"e7c62b6d-4e36-487e-908a-da0e5748a9bb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85kpf" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.450333 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04d75c03-610d-490c-a8f3-f98ca377ccda-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-k2hlr\" (UID: \"04d75c03-610d-490c-a8f3-f98ca377ccda\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2hlr" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.450401 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sncqj\" (UniqueName: \"kubernetes.io/projected/7bd4427e-4327-477c-a527-de6c4bf89088-kube-api-access-sncqj\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.450436 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6f562a0-0eb8-422c-9753-b637b745e579-metrics-tls\") pod \"dns-operator-744455d44c-jc49r\" (UID: 
\"f6f562a0-0eb8-422c-9753-b637b745e579\") " pod="openshift-dns-operator/dns-operator-744455d44c-jc49r" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.450500 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqnhm\" (UniqueName: \"kubernetes.io/projected/ad2d8057-b6d0-48fe-b95d-3126a4631754-kube-api-access-zqnhm\") pod \"migrator-59844c95c7-2mz86\" (UID: \"ad2d8057-b6d0-48fe-b95d-3126a4631754\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2mz86" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.450534 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98621832-7945-48e2-9aa9-f3fa87801887-proxy-tls\") pod \"machine-config-operator-74547568cd-lpmxp\" (UID: \"98621832-7945-48e2-9aa9-f3fa87801887\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lpmxp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.454261 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/98621832-7945-48e2-9aa9-f3fa87801887-auth-proxy-config\") pod \"machine-config-operator-74547568cd-lpmxp\" (UID: \"98621832-7945-48e2-9aa9-f3fa87801887\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lpmxp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.454261 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/65822bd2-50a3-41f3-b50d-ec86090ce4d4-tmpfs\") pod \"packageserver-d55dfcdfc-7qwwp\" (UID: \"65822bd2-50a3-41f3-b50d-ec86090ce4d4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7qwwp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.454984 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/ca46101e-e80b-47e2-bd0e-086ca89be1b1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8qkfs\" (UID: \"ca46101e-e80b-47e2-bd0e-086ca89be1b1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qkfs" Feb 18 05:49:01 crc kubenswrapper[4869]: E0218 05:49:01.455541 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:01.955516982 +0000 UTC m=+39.124605214 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.466670 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/464a7f33-924c-4f63-8c70-b9fac3a3a4c2-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-hj6gq\" (UID: \"464a7f33-924c-4f63-8c70-b9fac3a3a4c2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hj6gq" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.469391 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7bd4427e-4327-477c-a527-de6c4bf89088-trusted-ca\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 
05:49:01.473123 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/98621832-7945-48e2-9aa9-f3fa87801887-images\") pod \"machine-config-operator-74547568cd-lpmxp\" (UID: \"98621832-7945-48e2-9aa9-f3fa87801887\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lpmxp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.477326 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7bd4427e-4327-477c-a527-de6c4bf89088-registry-certificates\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.478041 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98621832-7945-48e2-9aa9-f3fa87801887-proxy-tls\") pod \"machine-config-operator-74547568cd-lpmxp\" (UID: \"98621832-7945-48e2-9aa9-f3fa87801887\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lpmxp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.483836 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04d75c03-610d-490c-a8f3-f98ca377ccda-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-k2hlr\" (UID: \"04d75c03-610d-490c-a8f3-f98ca377ccda\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2hlr" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.486070 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/65822bd2-50a3-41f3-b50d-ec86090ce4d4-apiservice-cert\") pod \"packageserver-d55dfcdfc-7qwwp\" (UID: \"65822bd2-50a3-41f3-b50d-ec86090ce4d4\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7qwwp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.487582 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e7c62b6d-4e36-487e-908a-da0e5748a9bb-profile-collector-cert\") pod \"olm-operator-6b444d44fb-85kpf\" (UID: \"e7c62b6d-4e36-487e-908a-da0e5748a9bb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85kpf" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.488042 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04d75c03-610d-490c-a8f3-f98ca377ccda-config\") pod \"kube-controller-manager-operator-78b949d7b-k2hlr\" (UID: \"04d75c03-610d-490c-a8f3-f98ca377ccda\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2hlr" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.488196 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7bd4427e-4327-477c-a527-de6c4bf89088-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.488716 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7bd4427e-4327-477c-a527-de6c4bf89088-registry-tls\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.489616 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/7bd4427e-4327-477c-a527-de6c4bf89088-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.497271 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6f562a0-0eb8-422c-9753-b637b745e579-metrics-tls\") pod \"dns-operator-744455d44c-jc49r\" (UID: \"f6f562a0-0eb8-422c-9753-b637b745e579\") " pod="openshift-dns-operator/dns-operator-744455d44c-jc49r" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.497775 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e7c62b6d-4e36-487e-908a-da0e5748a9bb-srv-cert\") pod \"olm-operator-6b444d44fb-85kpf\" (UID: \"e7c62b6d-4e36-487e-908a-da0e5748a9bb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85kpf" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.497905 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/65822bd2-50a3-41f3-b50d-ec86090ce4d4-webhook-cert\") pod \"packageserver-d55dfcdfc-7qwwp\" (UID: \"65822bd2-50a3-41f3-b50d-ec86090ce4d4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7qwwp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.536478 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nvz5\" (UniqueName: \"kubernetes.io/projected/f6f562a0-0eb8-422c-9753-b637b745e579-kube-api-access-5nvz5\") pod \"dns-operator-744455d44c-jc49r\" (UID: \"f6f562a0-0eb8-422c-9753-b637b745e579\") " pod="openshift-dns-operator/dns-operator-744455d44c-jc49r" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.541987 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/ca46101e-e80b-47e2-bd0e-086ca89be1b1-proxy-tls\") pod \"machine-config-controller-84d6567774-8qkfs\" (UID: \"ca46101e-e80b-47e2-bd0e-086ca89be1b1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qkfs" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.542467 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g57bq\" (UniqueName: \"kubernetes.io/projected/ca46101e-e80b-47e2-bd0e-086ca89be1b1-kube-api-access-g57bq\") pod \"machine-config-controller-84d6567774-8qkfs\" (UID: \"ca46101e-e80b-47e2-bd0e-086ca89be1b1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qkfs" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.553704 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:01 crc kubenswrapper[4869]: E0218 05:49:01.554128 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:02.054111593 +0000 UTC m=+39.223199825 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.554537 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9742d031-8f05-438c-8028-700eb13042fe-socket-dir\") pod \"csi-hostpathplugin-59xq2\" (UID: \"9742d031-8f05-438c-8028-700eb13042fe\") " pod="hostpath-provisioner/csi-hostpathplugin-59xq2" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.554632 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9742d031-8f05-438c-8028-700eb13042fe-csi-data-dir\") pod \"csi-hostpathplugin-59xq2\" (UID: \"9742d031-8f05-438c-8028-700eb13042fe\") " pod="hostpath-provisioner/csi-hostpathplugin-59xq2" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.554711 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ec275d88-805e-40d9-8c06-0c86712f739c-node-bootstrap-token\") pod \"machine-config-server-8xm9f\" (UID: \"ec275d88-805e-40d9-8c06-0c86712f739c\") " pod="openshift-machine-config-operator/machine-config-server-8xm9f" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.554808 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/873e5900-9845-4229-9d30-8b59c34f86fc-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-zc447\" (UID: \"873e5900-9845-4229-9d30-8b59c34f86fc\") " 
pod="openshift-multus/cni-sysctl-allowlist-ds-zc447" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.554905 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a6cedf4-7ab8-4763-8ccf-78c6a074b9c0-cert\") pod \"ingress-canary-t9nfk\" (UID: \"1a6cedf4-7ab8-4763-8ccf-78c6a074b9c0\") " pod="openshift-ingress-canary/ingress-canary-t9nfk" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.555015 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9742d031-8f05-438c-8028-700eb13042fe-plugins-dir\") pod \"csi-hostpathplugin-59xq2\" (UID: \"9742d031-8f05-438c-8028-700eb13042fe\") " pod="hostpath-provisioner/csi-hostpathplugin-59xq2" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.555105 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bms9\" (UniqueName: \"kubernetes.io/projected/ec275d88-805e-40d9-8c06-0c86712f739c-kube-api-access-2bms9\") pod \"machine-config-server-8xm9f\" (UID: \"ec275d88-805e-40d9-8c06-0c86712f739c\") " pod="openshift-machine-config-operator/machine-config-server-8xm9f" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.555215 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d2a84e5-581e-4dc3-93fe-b515f3b0f0e2-metrics-tls\") pod \"dns-default-h6w89\" (UID: \"6d2a84e5-581e-4dc3-93fe-b515f3b0f0e2\") " pod="openshift-dns/dns-default-h6w89" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.555288 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9742d031-8f05-438c-8028-700eb13042fe-registration-dir\") pod \"csi-hostpathplugin-59xq2\" (UID: \"9742d031-8f05-438c-8028-700eb13042fe\") " pod="hostpath-provisioner/csi-hostpathplugin-59xq2" Feb 18 
05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.555354 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/873e5900-9845-4229-9d30-8b59c34f86fc-ready\") pod \"cni-sysctl-allowlist-ds-zc447\" (UID: \"873e5900-9845-4229-9d30-8b59c34f86fc\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zc447" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.555416 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9742d031-8f05-438c-8028-700eb13042fe-mountpoint-dir\") pod \"csi-hostpathplugin-59xq2\" (UID: \"9742d031-8f05-438c-8028-700eb13042fe\") " pod="hostpath-provisioner/csi-hostpathplugin-59xq2" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.555496 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ec275d88-805e-40d9-8c06-0c86712f739c-certs\") pod \"machine-config-server-8xm9f\" (UID: \"ec275d88-805e-40d9-8c06-0c86712f739c\") " pod="openshift-machine-config-operator/machine-config-server-8xm9f" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.555561 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d2a84e5-581e-4dc3-93fe-b515f3b0f0e2-config-volume\") pod \"dns-default-h6w89\" (UID: \"6d2a84e5-581e-4dc3-93fe-b515f3b0f0e2\") " pod="openshift-dns/dns-default-h6w89" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.555639 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ndr8\" (UniqueName: \"kubernetes.io/projected/873e5900-9845-4229-9d30-8b59c34f86fc-kube-api-access-6ndr8\") pod \"cni-sysctl-allowlist-ds-zc447\" (UID: \"873e5900-9845-4229-9d30-8b59c34f86fc\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zc447" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.555703 
4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnkhd\" (UniqueName: \"kubernetes.io/projected/6d2a84e5-581e-4dc3-93fe-b515f3b0f0e2-kube-api-access-dnkhd\") pod \"dns-default-h6w89\" (UID: \"6d2a84e5-581e-4dc3-93fe-b515f3b0f0e2\") " pod="openshift-dns/dns-default-h6w89" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.555951 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.556980 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgvr6\" (UniqueName: \"kubernetes.io/projected/1a6cedf4-7ab8-4763-8ccf-78c6a074b9c0-kube-api-access-fgvr6\") pod \"ingress-canary-t9nfk\" (UID: \"1a6cedf4-7ab8-4763-8ccf-78c6a074b9c0\") " pod="openshift-ingress-canary/ingress-canary-t9nfk" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.557063 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psg42\" (UniqueName: \"kubernetes.io/projected/9742d031-8f05-438c-8028-700eb13042fe-kube-api-access-psg42\") pod \"csi-hostpathplugin-59xq2\" (UID: \"9742d031-8f05-438c-8028-700eb13042fe\") " pod="hostpath-provisioner/csi-hostpathplugin-59xq2" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.557174 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/873e5900-9845-4229-9d30-8b59c34f86fc-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-zc447\" (UID: \"873e5900-9845-4229-9d30-8b59c34f86fc\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zc447" Feb 18 
05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.557888 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/873e5900-9845-4229-9d30-8b59c34f86fc-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-zc447\" (UID: \"873e5900-9845-4229-9d30-8b59c34f86fc\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zc447" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.558871 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9742d031-8f05-438c-8028-700eb13042fe-registration-dir\") pod \"csi-hostpathplugin-59xq2\" (UID: \"9742d031-8f05-438c-8028-700eb13042fe\") " pod="hostpath-provisioner/csi-hostpathplugin-59xq2" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.558976 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9742d031-8f05-438c-8028-700eb13042fe-socket-dir\") pod \"csi-hostpathplugin-59xq2\" (UID: \"9742d031-8f05-438c-8028-700eb13042fe\") " pod="hostpath-provisioner/csi-hostpathplugin-59xq2" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.559066 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9742d031-8f05-438c-8028-700eb13042fe-csi-data-dir\") pod \"csi-hostpathplugin-59xq2\" (UID: \"9742d031-8f05-438c-8028-700eb13042fe\") " pod="hostpath-provisioner/csi-hostpathplugin-59xq2" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.559443 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9742d031-8f05-438c-8028-700eb13042fe-plugins-dir\") pod \"csi-hostpathplugin-59xq2\" (UID: \"9742d031-8f05-438c-8028-700eb13042fe\") " pod="hostpath-provisioner/csi-hostpathplugin-59xq2" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.559498 4869 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/873e5900-9845-4229-9d30-8b59c34f86fc-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-zc447\" (UID: \"873e5900-9845-4229-9d30-8b59c34f86fc\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zc447" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.565150 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a6cedf4-7ab8-4763-8ccf-78c6a074b9c0-cert\") pod \"ingress-canary-t9nfk\" (UID: \"1a6cedf4-7ab8-4763-8ccf-78c6a074b9c0\") " pod="openshift-ingress-canary/ingress-canary-t9nfk" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.566580 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/873e5900-9845-4229-9d30-8b59c34f86fc-ready\") pod \"cni-sysctl-allowlist-ds-zc447\" (UID: \"873e5900-9845-4229-9d30-8b59c34f86fc\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zc447" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.566634 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9742d031-8f05-438c-8028-700eb13042fe-mountpoint-dir\") pod \"csi-hostpathplugin-59xq2\" (UID: \"9742d031-8f05-438c-8028-700eb13042fe\") " pod="hostpath-provisioner/csi-hostpathplugin-59xq2" Feb 18 05:49:01 crc kubenswrapper[4869]: E0218 05:49:01.567107 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:02.067074748 +0000 UTC m=+39.236162980 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.567165 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ec275d88-805e-40d9-8c06-0c86712f739c-node-bootstrap-token\") pod \"machine-config-server-8xm9f\" (UID: \"ec275d88-805e-40d9-8c06-0c86712f739c\") " pod="openshift-machine-config-operator/machine-config-server-8xm9f" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.582679 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d2a84e5-581e-4dc3-93fe-b515f3b0f0e2-metrics-tls\") pod \"dns-default-h6w89\" (UID: \"6d2a84e5-581e-4dc3-93fe-b515f3b0f0e2\") " pod="openshift-dns/dns-default-h6w89" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.583811 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ec275d88-805e-40d9-8c06-0c86712f739c-certs\") pod \"machine-config-server-8xm9f\" (UID: \"ec275d88-805e-40d9-8c06-0c86712f739c\") " pod="openshift-machine-config-operator/machine-config-server-8xm9f" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.585860 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d2a84e5-581e-4dc3-93fe-b515f3b0f0e2-config-volume\") pod \"dns-default-h6w89\" (UID: \"6d2a84e5-581e-4dc3-93fe-b515f3b0f0e2\") " pod="openshift-dns/dns-default-h6w89" Feb 18 05:49:01 crc 
kubenswrapper[4869]: I0218 05:49:01.598144 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qkfs" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.604941 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptp5p\" (UniqueName: \"kubernetes.io/projected/65822bd2-50a3-41f3-b50d-ec86090ce4d4-kube-api-access-ptp5p\") pod \"packageserver-d55dfcdfc-7qwwp\" (UID: \"65822bd2-50a3-41f3-b50d-ec86090ce4d4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7qwwp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.612984 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fhpv8"] Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.632275 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qr2w\" (UniqueName: \"kubernetes.io/projected/e7c62b6d-4e36-487e-908a-da0e5748a9bb-kube-api-access-6qr2w\") pod \"olm-operator-6b444d44fb-85kpf\" (UID: \"e7c62b6d-4e36-487e-908a-da0e5748a9bb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85kpf" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.634543 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85kpf" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.635513 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04d75c03-610d-490c-a8f3-f98ca377ccda-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-k2hlr\" (UID: \"04d75c03-610d-490c-a8f3-f98ca377ccda\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2hlr" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.640246 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7bd4427e-4327-477c-a527-de6c4bf89088-bound-sa-token\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.640463 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mqfwk"] Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.644662 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s2lw\" (UniqueName: \"kubernetes.io/projected/98621832-7945-48e2-9aa9-f3fa87801887-kube-api-access-7s2lw\") pod \"machine-config-operator-74547568cd-lpmxp\" (UID: \"98621832-7945-48e2-9aa9-f3fa87801887\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lpmxp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.644908 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-twbsv"] Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.650920 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lpmxp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.659992 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:01 crc kubenswrapper[4869]: E0218 05:49:01.661330 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:02.161310052 +0000 UTC m=+39.330398284 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.664252 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5lmq\" (UniqueName: \"kubernetes.io/projected/464a7f33-924c-4f63-8c70-b9fac3a3a4c2-kube-api-access-m5lmq\") pod \"package-server-manager-789f6589d5-hj6gq\" (UID: \"464a7f33-924c-4f63-8c70-b9fac3a3a4c2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hj6gq" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.666627 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sncqj\" (UniqueName: 
\"kubernetes.io/projected/7bd4427e-4327-477c-a527-de6c4bf89088-kube-api-access-sncqj\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.672352 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-jc49r" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.689565 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqnhm\" (UniqueName: \"kubernetes.io/projected/ad2d8057-b6d0-48fe-b95d-3126a4631754-kube-api-access-zqnhm\") pod \"migrator-59844c95c7-2mz86\" (UID: \"ad2d8057-b6d0-48fe-b95d-3126a4631754\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2mz86" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.729004 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bms9\" (UniqueName: \"kubernetes.io/projected/ec275d88-805e-40d9-8c06-0c86712f739c-kube-api-access-2bms9\") pod \"machine-config-server-8xm9f\" (UID: \"ec275d88-805e-40d9-8c06-0c86712f739c\") " pod="openshift-machine-config-operator/machine-config-server-8xm9f" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.766218 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ndr8\" (UniqueName: \"kubernetes.io/projected/873e5900-9845-4229-9d30-8b59c34f86fc-kube-api-access-6ndr8\") pod \"cni-sysctl-allowlist-ds-zc447\" (UID: \"873e5900-9845-4229-9d30-8b59c34f86fc\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zc447" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.766630 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:01 crc kubenswrapper[4869]: E0218 05:49:01.767113 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:02.267098517 +0000 UTC m=+39.436186749 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.787423 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnkhd\" (UniqueName: \"kubernetes.io/projected/6d2a84e5-581e-4dc3-93fe-b515f3b0f0e2-kube-api-access-dnkhd\") pod \"dns-default-h6w89\" (UID: \"6d2a84e5-581e-4dc3-93fe-b515f3b0f0e2\") " pod="openshift-dns/dns-default-h6w89" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.854455 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psg42\" (UniqueName: \"kubernetes.io/projected/9742d031-8f05-438c-8028-700eb13042fe-kube-api-access-psg42\") pod \"csi-hostpathplugin-59xq2\" (UID: \"9742d031-8f05-438c-8028-700eb13042fe\") " pod="hostpath-provisioner/csi-hostpathplugin-59xq2" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.859445 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523225-j2h97"] Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.867450 
4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgvr6\" (UniqueName: \"kubernetes.io/projected/1a6cedf4-7ab8-4763-8ccf-78c6a074b9c0-kube-api-access-fgvr6\") pod \"ingress-canary-t9nfk\" (UID: \"1a6cedf4-7ab8-4763-8ccf-78c6a074b9c0\") " pod="openshift-ingress-canary/ingress-canary-t9nfk" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.870557 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:01 crc kubenswrapper[4869]: E0218 05:49:01.871125 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:02.370949205 +0000 UTC m=+39.540037437 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.872192 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2mz86" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.875356 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2hlr" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.886659 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7qwwp" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.899658 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzw2f" event={"ID":"71f66291-128a-4347-9c2b-e8a1f25aea67","Type":"ContainerStarted","Data":"bab2fcb11fca0e014e9ae492ef0b54f1347bcb3c7d3e553425069fd40da64a6d"} Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.899733 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzw2f" event={"ID":"71f66291-128a-4347-9c2b-e8a1f25aea67","Type":"ContainerStarted","Data":"b95484426d5ca34548f3820795f6ae965b1a2fc9bff796ba0d4a860d08cd2628"} Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.916422 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fhpv8" event={"ID":"81e0ea07-506f-4560-90de-b5ae7675113f","Type":"ContainerStarted","Data":"226a829a0b251113b6b27c74879edba1eacfeded0b7002e4eecd4e7108d1b503"} Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.940720 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bdj4n" event={"ID":"2e1a4023-11b5-47b3-886a-c37ecabc0103","Type":"ContainerStarted","Data":"dbab1aa414bc4efa5fedcf12bdfc8c3a7b6696f22086bc94a0e79c4f1b5b6f61"} Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.940989 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bdj4n" 
event={"ID":"2e1a4023-11b5-47b3-886a-c37ecabc0103","Type":"ContainerStarted","Data":"24b4dc6b2427b457a47db51ab69a03ae884f47bb050ecff284e4c57428435331"} Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.960613 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hj6gq" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.972788 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.972947 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lljlj" event={"ID":"7411fb2b-cd62-452d-a5c8-94135752329d","Type":"ContainerStarted","Data":"7466022f45b5451b3c258c3ed2a3de433f4b8ca1dbd97b2593b821785d1c49a8"} Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.973000 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lljlj" event={"ID":"7411fb2b-cd62-452d-a5c8-94135752329d","Type":"ContainerStarted","Data":"9ec128d7f9e96315a175023c8b2045251d08545b0034ac7c55621564699a89fe"} Feb 18 05:49:01 crc kubenswrapper[4869]: E0218 05:49:01.975540 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:02.475524821 +0000 UTC m=+39.644613053 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.984942 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tc859" event={"ID":"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2","Type":"ContainerStarted","Data":"1fc2a44e952d5eb7634f862c74fd056922e6d1403d74934b9724ee1cc067e5b7"} Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.985019 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tc859" event={"ID":"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2","Type":"ContainerStarted","Data":"2c90b0b6a107b8b2849e81696509f4d01f642eeedd2c6cec1152eabb918f5b40"} Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.988675 4869 generic.go:334] "Generic (PLEG): container finished" podID="33549d64-6034-4f60-b254-2729e899e541" containerID="cc8cc8f9d489847788ea43459bcc3add6091664d35248eeddd6f648a640bacbf" exitCode=0 Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.989162 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bft8b" event={"ID":"33549d64-6034-4f60-b254-2729e899e541","Type":"ContainerDied","Data":"cc8cc8f9d489847788ea43459bcc3add6091664d35248eeddd6f648a640bacbf"} Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.993028 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8xm9f" Feb 18 05:49:01 crc kubenswrapper[4869]: I0218 05:49:01.995923 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wgv88" event={"ID":"41f2c352-c537-4c3b-b206-ef73eba45593","Type":"ContainerStarted","Data":"b46463353e2094a360e859cb49425649482e18793f1be55fa4a02fbc9fdc7164"} Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.001647 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522" event={"ID":"c53da5b2-9e42-4160-8ed1-2600d9e76880","Type":"ContainerStarted","Data":"79ece397e7118651c97eb5815b2ed3b6c8bf7639c5bb8b2213ddefd307c01aff"} Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.001710 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522" event={"ID":"c53da5b2-9e42-4160-8ed1-2600d9e76880","Type":"ContainerStarted","Data":"ea9d1afc23b7243ed492a555e2c05fd770560b93a2a719d5c9c8693f8996d1a8"} Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.002431 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522" Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.003364 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t9nfk" Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.004491 4869 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-b7522 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.004545 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522" podUID="c53da5b2-9e42-4160-8ed1-2600d9e76880" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.009394 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-h6w89" Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.015647 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" event={"ID":"b8d17264-7c40-4eb6-82eb-f4020e635dde","Type":"ContainerStarted","Data":"6596eb468fc46bbcc66abab2b975096bae4f088eaa9ee67f67343bcb8236db98"} Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.016668 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-twbsv" event={"ID":"c2a55f3d-af8c-4589-a9b0-22b324a77524","Type":"ContainerStarted","Data":"730fc5027141b6ab5699a12b6c01eb524ac4908710857bae55663919e829ad2f"} Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.019619 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-zc447" Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.026240 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mx9zm" event={"ID":"d13ea6d8-477e-4add-b9dc-f8cac9eb0b01","Type":"ContainerStarted","Data":"353cae19776402f4071ca28c1b97ea8b771d4287055e27fc3f11700f1384826b"} Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.027417 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-mx9zm" Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.029415 4869 patch_prober.go:28] interesting pod/downloads-7954f5f757-mx9zm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.029461 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mx9zm" podUID="d13ea6d8-477e-4add-b9dc-f8cac9eb0b01" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.031923 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-pg7gk" event={"ID":"61805635-7d39-4980-b1be-18cf8f05074d","Type":"ContainerStarted","Data":"22d887c9ac83ad40f34b63ddef81916195d177ab4c99d793589949b184bec6bd"} Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.032026 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-pg7gk" event={"ID":"61805635-7d39-4980-b1be-18cf8f05074d","Type":"ContainerStarted","Data":"aad0e8103bd3866d16005e76cb798050b82eaf8e668b1ad4f3973123c20940af"} Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.040382 
4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fbl5d" event={"ID":"f43ce73f-2618-4c47-9304-7db1e67db22f","Type":"ContainerStarted","Data":"72c90b4a6f020796601ae457282e73a03fbee43352c0976f7f5bb2b971cf9778"} Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.040440 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fbl5d" event={"ID":"f43ce73f-2618-4c47-9304-7db1e67db22f","Type":"ContainerStarted","Data":"b8b9a00ee86dc14383332e74c2fef1cdd4ebfd2717cd1092c8408fb7fcdf3219"} Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.041259 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-59xq2" Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.062354 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl" event={"ID":"5a9ba96f-26e6-4870-9e59-9735f210eef3","Type":"ContainerStarted","Data":"47ab904e33db53c2142ea766fb4f508b82a40410721e2daab19618a5abc034e8"} Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.062417 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl" event={"ID":"5a9ba96f-26e6-4870-9e59-9735f210eef3","Type":"ContainerStarted","Data":"f38b5769e49725124cebffcacd874108dc7086df671e943c95d4d09f43a2df8d"} Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.063541 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl" Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.069819 4869 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-9m6gl container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 
10.217.0.16:8443: connect: connection refused" start-of-body= Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.069884 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl" podUID="5a9ba96f-26e6-4870-9e59-9735f210eef3" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.074395 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:02 crc kubenswrapper[4869]: E0218 05:49:02.079627 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:02.579595674 +0000 UTC m=+39.748683896 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.079762 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-mqfwk" event={"ID":"f39f9dc9-3127-4302-93dd-faf8bb814b58","Type":"ContainerStarted","Data":"7d035e1b5c0fa23dbd931c70fadd19106335b1d8f1f48944e54b5911f262187c"} Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.088303 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.180433 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zkqc9"] Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.185198 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:02 crc kubenswrapper[4869]: E0218 05:49:02.186110 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:02.686074936 +0000 UTC m=+39.855163168 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.204882 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5d72"] Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.213731 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vpxmk"] Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.250757 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-tzbsp"] Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.254267 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2hvk6"] Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.257490 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zd46x"] Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.257565 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pmqtp"] Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.262204 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-tc859" podStartSLOduration=16.262191869 podStartE2EDuration="16.262191869s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:02.261805129 +0000 UTC m=+39.430893361" watchObservedRunningTime="2026-02-18 05:49:02.262191869 +0000 UTC m=+39.431280101" Feb 18 05:49:02 crc kubenswrapper[4869]: E0218 05:49:02.287582 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:02.787553087 +0000 UTC m=+39.956641319 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.287458 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.288077 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:02 crc kubenswrapper[4869]: E0218 05:49:02.288624 4869 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:02.788613012 +0000 UTC m=+39.957701254 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.301369 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzw2f" podStartSLOduration=16.301338252 podStartE2EDuration="16.301338252s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:02.27905821 +0000 UTC m=+39.448146442" watchObservedRunningTime="2026-02-18 05:49:02.301338252 +0000 UTC m=+39.470426484" Feb 18 05:49:02 crc kubenswrapper[4869]: W0218 05:49:02.374756 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79b505d9_7c7c_4ef4_a689_e2b80d855a09.slice/crio-38b2ea48715bae1bb9e519f3e19899f5c0e018f6f1ffb8adc0a343730ea80492 WatchSource:0}: Error finding container 38b2ea48715bae1bb9e519f3e19899f5c0e018f6f1ffb8adc0a343730ea80492: Status 404 returned error can't find the container with id 38b2ea48715bae1bb9e519f3e19899f5c0e018f6f1ffb8adc0a343730ea80492 Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.389047 4869 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:02 crc kubenswrapper[4869]: E0218 05:49:02.389430 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:02.889409805 +0000 UTC m=+40.058498037 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.442676 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-2gwhz" podStartSLOduration=16.442653681 podStartE2EDuration="16.442653681s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:02.442417086 +0000 UTC m=+39.611505328" watchObservedRunningTime="2026-02-18 05:49:02.442653681 +0000 UTC m=+39.611741913" Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.511569 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-mx9zm" podStartSLOduration=16.511541939 
podStartE2EDuration="16.511541939s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:02.470241043 +0000 UTC m=+39.639329275" watchObservedRunningTime="2026-02-18 05:49:02.511541939 +0000 UTC m=+39.680630171" Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.513061 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:02 crc kubenswrapper[4869]: E0218 05:49:02.513607 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:03.013588818 +0000 UTC m=+40.182677050 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.598430 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c5v7h" podStartSLOduration=16.598409564 podStartE2EDuration="16.598409564s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:02.593570395 +0000 UTC m=+39.762658627" watchObservedRunningTime="2026-02-18 05:49:02.598409564 +0000 UTC m=+39.767497796" Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.614895 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.615153 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c4bcbdd-2490-4d47-b2b3-a2e832c63100-metrics-certs\") pod \"network-metrics-daemon-ckzlt\" (UID: \"1c4bcbdd-2490-4d47-b2b3-a2e832c63100\") " pod="openshift-multus/network-metrics-daemon-ckzlt" Feb 18 05:49:02 crc kubenswrapper[4869]: E0218 05:49:02.617571 4869 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:03.117532548 +0000 UTC m=+40.286620780 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.666711 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c4bcbdd-2490-4d47-b2b3-a2e832c63100-metrics-certs\") pod \"network-metrics-daemon-ckzlt\" (UID: \"1c4bcbdd-2490-4d47-b2b3-a2e832c63100\") " pod="openshift-multus/network-metrics-daemon-ckzlt" Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.712884 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-pg7gk" podStartSLOduration=16.712854869 podStartE2EDuration="16.712854869s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:02.7108431 +0000 UTC m=+39.879931342" watchObservedRunningTime="2026-02-18 05:49:02.712854869 +0000 UTC m=+39.881943101" Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.716639 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:02 crc kubenswrapper[4869]: E0218 05:49:02.717145 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:03.217123622 +0000 UTC m=+40.386211854 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.816119 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckzlt" Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.817996 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:02 crc kubenswrapper[4869]: E0218 05:49:02.818999 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:03.318964452 +0000 UTC m=+40.488052684 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.921865 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:02 crc kubenswrapper[4869]: E0218 05:49:02.926104 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:03.42608783 +0000 UTC m=+40.595176062 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.928177 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zmmhg"] Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.941697 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-pg7gk" Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.960385 4869 patch_prober.go:28] interesting pod/router-default-5444994796-pg7gk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 05:49:02 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld Feb 18 05:49:02 crc kubenswrapper[4869]: [+]process-running ok Feb 18 05:49:02 crc kubenswrapper[4869]: healthz check failed Feb 18 05:49:02 crc kubenswrapper[4869]: I0218 05:49:02.960454 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pg7gk" podUID="61805635-7d39-4980-b1be-18cf8f05074d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 05:49:02 crc kubenswrapper[4869]: W0218 05:49:02.968363 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb635dd52_43a3_4285_a6f9_86b872db734a.slice/crio-da897d073cffb87204674811554bd20c32edc1617b1da07f381dae9eb79dad47 WatchSource:0}: 
Error finding container da897d073cffb87204674811554bd20c32edc1617b1da07f381dae9eb79dad47: Status 404 returned error can't find the container with id da897d073cffb87204674811554bd20c32edc1617b1da07f381dae9eb79dad47 Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.042137 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:03 crc kubenswrapper[4869]: E0218 05:49:03.042817 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:03.54279481 +0000 UTC m=+40.711883042 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.088688 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-cnzqw"] Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.095667 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85kpf"] Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.113910 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-mqfwk" event={"ID":"f39f9dc9-3127-4302-93dd-faf8bb814b58","Type":"ContainerStarted","Data":"f12a10087f2e3e4044f3668cdc647a2aebf2750537e14dc02b3c6efdf167196f"} Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.113997 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-mqfwk" Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.118420 4869 patch_prober.go:28] interesting pod/console-operator-58897d9998-mqfwk container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.118495 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-mqfwk" podUID="f39f9dc9-3127-4302-93dd-faf8bb814b58" containerName="console-operator" 
probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.132835 4869 generic.go:334] "Generic (PLEG): container finished" podID="f43ce73f-2618-4c47-9304-7db1e67db22f" containerID="72c90b4a6f020796601ae457282e73a03fbee43352c0976f7f5bb2b971cf9778" exitCode=0 Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.132947 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fbl5d" event={"ID":"f43ce73f-2618-4c47-9304-7db1e67db22f","Type":"ContainerDied","Data":"72c90b4a6f020796601ae457282e73a03fbee43352c0976f7f5bb2b971cf9778"} Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.146506 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.147469 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zkqc9" event={"ID":"311f547d-0650-427d-8a3d-31bfa0b56403","Type":"ContainerStarted","Data":"674d8d87f315c4868677c0c9332e3cf1e9a5f17c4f190feb66b2b5c9afec2855"} Feb 18 05:49:03 crc kubenswrapper[4869]: E0218 05:49:03.148542 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:03.648521635 +0000 UTC m=+40.817610027 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.169813 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bdj4n" podStartSLOduration=17.169790972 podStartE2EDuration="17.169790972s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:03.169399282 +0000 UTC m=+40.338487514" watchObservedRunningTime="2026-02-18 05:49:03.169790972 +0000 UTC m=+40.338879204" Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.170148 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fhpv8" event={"ID":"81e0ea07-506f-4560-90de-b5ae7675113f","Type":"ContainerStarted","Data":"46bb1481dda7dada61aaefd319ba067f62355fc4c03150c1a55bc21bfd31f4d0"} Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.176636 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-j2h97" event={"ID":"c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf","Type":"ContainerStarted","Data":"d17df9aff2424f0a96ad9b0ebf96463bf8c7261677ba57f821a56caa3f292cf6"} Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.185412 4869 generic.go:334] "Generic (PLEG): container finished" podID="b8d17264-7c40-4eb6-82eb-f4020e635dde" containerID="cfcacec8a3e420d7c890743e8bf0d40e6194f23e0931311efa59fc8cce5284d4" exitCode=0 Feb 18 
05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.185835 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" event={"ID":"b8d17264-7c40-4eb6-82eb-f4020e635dde","Type":"ContainerDied","Data":"cfcacec8a3e420d7c890743e8bf0d40e6194f23e0931311efa59fc8cce5284d4"} Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.209456 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vpxmk" event={"ID":"16d0750f-52c1-450d-8f18-d068333e7bc3","Type":"ContainerStarted","Data":"e93ec32b3aacffded9149fcf8ac1761d22075d1d2102b3ba0e714ae2ac10707c"} Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.244016 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p8b4x" event={"ID":"f96a9f3e-b511-4f78-83fd-a134aa3d5106","Type":"ContainerStarted","Data":"ceea26c2ffbb12f3a1b4707ad64f8d8ee44a70af1d39d05f58a2475fc84cec38"} Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.244073 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p8b4x" event={"ID":"f96a9f3e-b511-4f78-83fd-a134aa3d5106","Type":"ContainerStarted","Data":"7d3a0c1bc255ece3b001e6b288fdf4ac82a7642085771b9a3eea056873b16bd5"} Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.250250 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:03 crc kubenswrapper[4869]: E0218 05:49:03.250729 4869 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:03.750703992 +0000 UTC m=+40.919792224 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.250802 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:03 crc kubenswrapper[4869]: E0218 05:49:03.252483 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:03.752466285 +0000 UTC m=+40.921554507 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.274432 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-tzbsp" event={"ID":"84302a1e-f497-4332-8736-5e53c6741081","Type":"ContainerStarted","Data":"dc6a262cac6815605e26bd810c2f0157ccbcd94b7217cbe4da256faf214f2758"} Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.294222 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5d72" event={"ID":"79b505d9-7c7c-4ef4-a689-e2b80d855a09","Type":"ContainerStarted","Data":"38b2ea48715bae1bb9e519f3e19899f5c0e018f6f1ffb8adc0a343730ea80492"} Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.297331 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2hvk6" event={"ID":"d3b34965-6f66-4f22-b655-05ae645a8f19","Type":"ContainerStarted","Data":"ff900fbabbc110ae2df0d64d74eaf0cc6b6721164010c98ed64e40f0ad546b18"} Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.318036 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" podStartSLOduration=17.31801239 podStartE2EDuration="17.31801239s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:03.286351799 +0000 UTC 
m=+40.455440031" watchObservedRunningTime="2026-02-18 05:49:03.31801239 +0000 UTC m=+40.487100632" Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.318918 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-twbsv" event={"ID":"c2a55f3d-af8c-4589-a9b0-22b324a77524","Type":"ContainerStarted","Data":"ef3e125f182ca060538938a41b927921ef7bc3ec7c1b4079619bc64a44bf3c7c"} Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.348663 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8qkfs"] Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.350023 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-zc447" event={"ID":"873e5900-9845-4229-9d30-8b59c34f86fc","Type":"ContainerStarted","Data":"fd865cd40c744ddfd22ad0b84037c0db21f48e91acd4a030906fda3bad57fbcb"} Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.352669 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:03 crc kubenswrapper[4869]: E0218 05:49:03.353905 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:03.853867692 +0000 UTC m=+41.022955924 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.387568 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8xm9f" event={"ID":"ec275d88-805e-40d9-8c06-0c86712f739c","Type":"ContainerStarted","Data":"a86a7f74745e1a1ede0a000e4094fcb6816d8865cb8401fd8e7cadf3714f5b8b"} Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.408406 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lpmxp"] Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.416439 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jc49r"] Feb 18 05:49:03 crc kubenswrapper[4869]: W0218 05:49:03.454953 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca46101e_e80b_47e2_bd0e_086ca89be1b1.slice/crio-f1c34e76dbf9f358ab63574bbff7699ff80a503fa83cdd7c6bb1cf19ab3431bb WatchSource:0}: Error finding container f1c34e76dbf9f358ab63574bbff7699ff80a503fa83cdd7c6bb1cf19ab3431bb: Status 404 returned error can't find the container with id f1c34e76dbf9f358ab63574bbff7699ff80a503fa83cdd7c6bb1cf19ab3431bb Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.455415 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zd46x" 
event={"ID":"a5c8a86f-e3a2-4088-9839-386b9dc56d03","Type":"ContainerStarted","Data":"62fa0171431dc588dc09904620132917d479adfcbe5e5987bc671504556b7cb0"} Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.475089 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:03 crc kubenswrapper[4869]: E0218 05:49:03.475693 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:03.975675197 +0000 UTC m=+41.144763429 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.534273 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zmmhg" event={"ID":"b635dd52-43a3-4285-a6f9-86b872db734a","Type":"ContainerStarted","Data":"da897d073cffb87204674811554bd20c32edc1617b1da07f381dae9eb79dad47"} Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.535075 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7qwwp"] Feb 18 05:49:03 crc kubenswrapper[4869]: 
I0218 05:49:03.536862 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2hlr"] Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.538173 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pmqtp" event={"ID":"be5b136e-e664-4ed5-9fbd-e2a9bdd06db9","Type":"ContainerStarted","Data":"61fa09f7eb6215bd32f2ba60e0cc6691ec0ce1d5edb9bd82d068e1c370b9266d"} Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.543136 4869 patch_prober.go:28] interesting pod/downloads-7954f5f757-mx9zm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.545586 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mx9zm" podUID="d13ea6d8-477e-4add-b9dc-f8cac9eb0b01" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.549275 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl" Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.553294 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-lljlj" podStartSLOduration=17.553238966 podStartE2EDuration="17.553238966s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:03.527622712 +0000 UTC m=+40.696710944" watchObservedRunningTime="2026-02-18 05:49:03.553238966 +0000 
UTC m=+40.722327198" Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.558701 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522" Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.560367 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-2mz86"] Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.623496 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:03 crc kubenswrapper[4869]: E0218 05:49:03.629006 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:04.1289872 +0000 UTC m=+41.298075432 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.643172 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wgv88" podStartSLOduration=17.643140664 podStartE2EDuration="17.643140664s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:03.566390276 +0000 UTC m=+40.735478508" watchObservedRunningTime="2026-02-18 05:49:03.643140664 +0000 UTC m=+40.812228896" Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.651155 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-h6w89"] Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.658828 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hj6gq"] Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.671507 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-t9nfk"] Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.689300 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ckzlt"] Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.691293 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522" 
podStartSLOduration=16.691271876 podStartE2EDuration="16.691271876s" podCreationTimestamp="2026-02-18 05:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:03.67133026 +0000 UTC m=+40.840418492" watchObservedRunningTime="2026-02-18 05:49:03.691271876 +0000 UTC m=+40.860360108" Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.697944 4869 csr.go:261] certificate signing request csr-qlz9w is approved, waiting to be issued Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.706422 4869 csr.go:257] certificate signing request csr-qlz9w is issued Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.730529 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:03 crc kubenswrapper[4869]: E0218 05:49:03.730895 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:04.230882141 +0000 UTC m=+41.399970373 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.741842 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-59xq2"] Feb 18 05:49:03 crc kubenswrapper[4869]: W0218 05:49:03.808324 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d2a84e5_581e_4dc3_93fe_b515f3b0f0e2.slice/crio-7de8ded3486c8124f1b20b1178c827bf489c63fe1a54d29813e8d9c06335dbc3 WatchSource:0}: Error finding container 7de8ded3486c8124f1b20b1178c827bf489c63fe1a54d29813e8d9c06335dbc3: Status 404 returned error can't find the container with id 7de8ded3486c8124f1b20b1178c827bf489c63fe1a54d29813e8d9c06335dbc3 Feb 18 05:49:03 crc kubenswrapper[4869]: W0218 05:49:03.828122 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9742d031_8f05_438c_8028_700eb13042fe.slice/crio-5b0020a9924564bbeedc8fb88217adc8553ea8c4886d914b577dc58709b2560f WatchSource:0}: Error finding container 5b0020a9924564bbeedc8fb88217adc8553ea8c4886d914b577dc58709b2560f: Status 404 returned error can't find the container with id 5b0020a9924564bbeedc8fb88217adc8553ea8c4886d914b577dc58709b2560f Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.836787 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:03 crc kubenswrapper[4869]: E0218 05:49:03.837110 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:04.337095595 +0000 UTC m=+41.506183827 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.837376 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:03 crc kubenswrapper[4869]: E0218 05:49:03.837924 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:04.337910555 +0000 UTC m=+41.506998787 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:03 crc kubenswrapper[4869]: W0218 05:49:03.864311 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c4bcbdd_2490_4d47_b2b3_a2e832c63100.slice/crio-bf8827da32a2de65ba6e344060dc279aac53b2ca53dcd349e89a1cc497e1354f WatchSource:0}: Error finding container bf8827da32a2de65ba6e344060dc279aac53b2ca53dcd349e89a1cc497e1354f: Status 404 returned error can't find the container with id bf8827da32a2de65ba6e344060dc279aac53b2ca53dcd349e89a1cc497e1354f Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.953518 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:03 crc kubenswrapper[4869]: E0218 05:49:03.953974 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:04.45395671 +0000 UTC m=+41.623044942 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.961046 4869 patch_prober.go:28] interesting pod/router-default-5444994796-pg7gk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 05:49:03 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld Feb 18 05:49:03 crc kubenswrapper[4869]: [+]process-running ok Feb 18 05:49:03 crc kubenswrapper[4869]: healthz check failed Feb 18 05:49:03 crc kubenswrapper[4869]: I0218 05:49:03.962695 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pg7gk" podUID="61805635-7d39-4980-b1be-18cf8f05074d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 05:49:04 crc kubenswrapper[4869]: I0218 05:49:04.046688 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-j2h97" podStartSLOduration=18.046648166 podStartE2EDuration="18.046648166s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:04.008434736 +0000 UTC m=+41.177522978" watchObservedRunningTime="2026-02-18 05:49:04.046648166 +0000 UTC m=+41.215736398" Feb 18 05:49:04 crc kubenswrapper[4869]: I0218 05:49:04.058977 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:04 crc kubenswrapper[4869]: E0218 05:49:04.059421 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:04.559407407 +0000 UTC m=+41.728495639 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:04 crc kubenswrapper[4869]: I0218 05:49:04.097375 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-fhpv8" podStartSLOduration=18.09733762 podStartE2EDuration="18.09733762s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:04.047652591 +0000 UTC m=+41.216740823" watchObservedRunningTime="2026-02-18 05:49:04.09733762 +0000 UTC m=+41.266425852" Feb 18 05:49:04 crc kubenswrapper[4869]: I0218 05:49:04.164591 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:04 crc kubenswrapper[4869]: E0218 05:49:04.165000 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:04.664976537 +0000 UTC m=+41.834064769 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:04 crc kubenswrapper[4869]: I0218 05:49:04.165202 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:04 crc kubenswrapper[4869]: E0218 05:49:04.165609 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:04.665593752 +0000 UTC m=+41.834681984 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:04 crc kubenswrapper[4869]: I0218 05:49:04.267511 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:04 crc kubenswrapper[4869]: E0218 05:49:04.267925 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:04.767904522 +0000 UTC m=+41.936992754 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:04 crc kubenswrapper[4869]: I0218 05:49:04.281802 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl" podStartSLOduration=18.28177549 podStartE2EDuration="18.28177549s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:04.27112187 +0000 UTC m=+41.440210102" watchObservedRunningTime="2026-02-18 05:49:04.28177549 +0000 UTC m=+41.450863722" Feb 18 05:49:04 crc kubenswrapper[4869]: I0218 05:49:04.374369 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-mqfwk" podStartSLOduration=18.374352033 podStartE2EDuration="18.374352033s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:04.372494888 +0000 UTC m=+41.541583120" watchObservedRunningTime="2026-02-18 05:49:04.374352033 +0000 UTC m=+41.543440255" Feb 18 05:49:04 crc kubenswrapper[4869]: I0218 05:49:04.377098 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: 
\"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:04 crc kubenswrapper[4869]: E0218 05:49:04.377577 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:04.877562841 +0000 UTC m=+42.046651073 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:04 crc kubenswrapper[4869]: I0218 05:49:04.477949 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:04 crc kubenswrapper[4869]: E0218 05:49:04.478476 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:04.978454397 +0000 UTC m=+42.147542629 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:04 crc kubenswrapper[4869]: I0218 05:49:04.579384 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:04 crc kubenswrapper[4869]: E0218 05:49:04.580070 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:05.080052011 +0000 UTC m=+42.249140243 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:04 crc kubenswrapper[4869]: I0218 05:49:04.593913 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qkfs" event={"ID":"ca46101e-e80b-47e2-bd0e-086ca89be1b1","Type":"ContainerStarted","Data":"bea20b4461fb8a1c28e1035fdc480cfac77c5bc752c4f56236d307d2c656959e"} Feb 18 05:49:04 crc kubenswrapper[4869]: I0218 05:49:04.593969 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qkfs" event={"ID":"ca46101e-e80b-47e2-bd0e-086ca89be1b1","Type":"ContainerStarted","Data":"f1c34e76dbf9f358ab63574bbff7699ff80a503fa83cdd7c6bb1cf19ab3431bb"} Feb 18 05:49:04 crc kubenswrapper[4869]: I0218 05:49:04.652008 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-twbsv" event={"ID":"c2a55f3d-af8c-4589-a9b0-22b324a77524","Type":"ContainerStarted","Data":"988eeda757b73316327da0462b673d43d323c1bb741a6eb5b89a77810f8ea1c4"} Feb 18 05:49:04 crc kubenswrapper[4869]: I0218 05:49:04.682107 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:04 crc kubenswrapper[4869]: E0218 05:49:04.682854 4869 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:05.182820002 +0000 UTC m=+42.351908234 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:04 crc kubenswrapper[4869]: I0218 05:49:04.697912 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ckzlt" event={"ID":"1c4bcbdd-2490-4d47-b2b3-a2e832c63100","Type":"ContainerStarted","Data":"bf8827da32a2de65ba6e344060dc279aac53b2ca53dcd349e89a1cc497e1354f"} Feb 18 05:49:04 crc kubenswrapper[4869]: I0218 05:49:04.712201 4869 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-18 05:44:03 +0000 UTC, rotation deadline is 2026-11-17 19:12:18.171596765 +0000 UTC Feb 18 05:49:04 crc kubenswrapper[4869]: I0218 05:49:04.712232 4869 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6541h23m13.459366947s for next certificate rotation Feb 18 05:49:04 crc kubenswrapper[4869]: I0218 05:49:04.716896 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7qwwp" event={"ID":"65822bd2-50a3-41f3-b50d-ec86090ce4d4","Type":"ContainerStarted","Data":"5d46bdca66bb62ede951b570aa3abd55c8d8cc9f7bd0f2121b8dd3d1901dc22e"} Feb 18 05:49:04 crc kubenswrapper[4869]: I0218 05:49:04.795864 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:04 crc kubenswrapper[4869]: E0218 05:49:04.796826 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:05.296808987 +0000 UTC m=+42.465897219 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:04 crc kubenswrapper[4869]: I0218 05:49:04.910434 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:04 crc kubenswrapper[4869]: E0218 05:49:04.910948 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:05.410911734 +0000 UTC m=+42.579999966 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:04 crc kubenswrapper[4869]: I0218 05:49:04.920794 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2hvk6" event={"ID":"d3b34965-6f66-4f22-b655-05ae645a8f19","Type":"ContainerStarted","Data":"c0e70b84354df98b9eca2c0f464ddd9e59560298e60395e7334c4a2806300971"} Feb 18 05:49:04 crc kubenswrapper[4869]: I0218 05:49:04.981901 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2mz86" event={"ID":"ad2d8057-b6d0-48fe-b95d-3126a4631754","Type":"ContainerStarted","Data":"6fc83147e134f21ae156d3a56b46e001f7bf112e7ccac2c4b990d9f136d416d2"} Feb 18 05:49:04 crc kubenswrapper[4869]: I0218 05:49:04.982256 4869 patch_prober.go:28] interesting pod/router-default-5444994796-pg7gk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 05:49:04 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld Feb 18 05:49:04 crc kubenswrapper[4869]: [+]process-running ok Feb 18 05:49:04 crc kubenswrapper[4869]: healthz check failed Feb 18 05:49:04 crc kubenswrapper[4869]: I0218 05:49:04.982386 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pg7gk" podUID="61805635-7d39-4980-b1be-18cf8f05074d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500" Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.017673 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:05 crc kubenswrapper[4869]: E0218 05:49:05.020058 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:05.52003704 +0000 UTC m=+42.689125272 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.050272 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2hvk6" podStartSLOduration=19.050249866 podStartE2EDuration="19.050249866s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:05.048167755 +0000 UTC m=+42.217255987" watchObservedRunningTime="2026-02-18 05:49:05.050249866 +0000 UTC m=+42.219338098" Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.060152 4869 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zkqc9" event={"ID":"311f547d-0650-427d-8a3d-31bfa0b56403","Type":"ContainerStarted","Data":"bb4a000feb476c1554c59379efa805ab93491407c5aa814922571ded134ae256"} Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.074259 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vpxmk" event={"ID":"16d0750f-52c1-450d-8f18-d068333e7bc3","Type":"ContainerStarted","Data":"662994c4be693e28e07e502a94c8be74a44beafb56b6939daafcc2dbf1295718"} Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.120066 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-twbsv" podStartSLOduration=19.120046155 podStartE2EDuration="19.120046155s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:05.118300782 +0000 UTC m=+42.287389014" watchObservedRunningTime="2026-02-18 05:49:05.120046155 +0000 UTC m=+42.289134387" Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.120633 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:05 crc kubenswrapper[4869]: E0218 05:49:05.121136 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:05.62109044 +0000 UTC m=+42.790178712 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.130934 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-zc447" event={"ID":"873e5900-9845-4229-9d30-8b59c34f86fc","Type":"ContainerStarted","Data":"6592c56fc0692e811ff75555bd961fe7caa7d8b8eae9f718fec9e50a184f7b08"} Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.132113 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-zc447" Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.140808 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hj6gq" event={"ID":"464a7f33-924c-4f63-8c70-b9fac3a3a4c2","Type":"ContainerStarted","Data":"75d79053a5d488a92a377480c90b5944235503b7fba885b62270be7772cb7861"} Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.172986 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lpmxp" event={"ID":"98621832-7945-48e2-9aa9-f3fa87801887","Type":"ContainerStarted","Data":"e2dea2eedaaf3b67cf2eb2e353cb79ef70b6b317b3c03f9238ac3f2781ed1036"} Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.177434 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vpxmk" podStartSLOduration=19.177417072 podStartE2EDuration="19.177417072s" podCreationTimestamp="2026-02-18 05:48:46 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:05.175576887 +0000 UTC m=+42.344665119" watchObservedRunningTime="2026-02-18 05:49:05.177417072 +0000 UTC m=+42.346505304" Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.223799 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-59xq2" event={"ID":"9742d031-8f05-438c-8028-700eb13042fe","Type":"ContainerStarted","Data":"5b0020a9924564bbeedc8fb88217adc8553ea8c4886d914b577dc58709b2560f"} Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.224421 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.225441 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zkqc9" podStartSLOduration=19.22541849 podStartE2EDuration="19.22541849s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:05.222641433 +0000 UTC m=+42.391729665" watchObservedRunningTime="2026-02-18 05:49:05.22541849 +0000 UTC m=+42.394506712" Feb 18 05:49:05 crc kubenswrapper[4869]: E0218 05:49:05.225805 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-18 05:49:05.725789129 +0000 UTC m=+42.894877361 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.282503 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-t9nfk" event={"ID":"1a6cedf4-7ab8-4763-8ccf-78c6a074b9c0","Type":"ContainerStarted","Data":"56abc2c7347404ad47fb5f764fb2732948d588ec97e685d02df8fc1155de22aa"} Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.309278 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fbl5d" event={"ID":"f43ce73f-2618-4c47-9304-7db1e67db22f","Type":"ContainerStarted","Data":"303690eab8614a0ea7c7836d802318a533e352c6aee41e09c906130f003a54f1"} Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.309982 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fbl5d" Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.312838 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-zc447" podStartSLOduration=7.312826778 podStartE2EDuration="7.312826778s" podCreationTimestamp="2026-02-18 05:48:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:05.272957687 +0000 UTC m=+42.442045919" watchObservedRunningTime="2026-02-18 05:49:05.312826778 +0000 UTC 
m=+42.481915010" Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.319369 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-zc447" Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.325419 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:05 crc kubenswrapper[4869]: E0218 05:49:05.325805 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:05.825787454 +0000 UTC m=+42.994875686 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.345068 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-t9nfk" podStartSLOduration=7.345048792 podStartE2EDuration="7.345048792s" podCreationTimestamp="2026-02-18 05:48:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:05.323418956 +0000 UTC m=+42.492507188" watchObservedRunningTime="2026-02-18 05:49:05.345048792 +0000 
UTC m=+42.514137014" Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.404840 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-cnzqw" event={"ID":"67eed62e-4836-47bb-96d0-cd6668d64a78","Type":"ContainerStarted","Data":"7197d353d9571fe62c9914dac5b19bf56656da4bccb632dc752485b8efddb4bb"} Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.404904 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-cnzqw" event={"ID":"67eed62e-4836-47bb-96d0-cd6668d64a78","Type":"ContainerStarted","Data":"8582e7fac362b230d2433f564d92741e88d91fbfdd3ef5775132d7896ebe9cac"} Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.427483 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:05 crc kubenswrapper[4869]: E0218 05:49:05.428131 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:05.928115315 +0000 UTC m=+43.097203547 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.462791 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h6w89" event={"ID":"6d2a84e5-581e-4dc3-93fe-b515f3b0f0e2","Type":"ContainerStarted","Data":"7de8ded3486c8124f1b20b1178c827bf489c63fe1a54d29813e8d9c06335dbc3"} Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.504816 4869 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-85kpf container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.504888 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85kpf" podUID="e7c62b6d-4e36-487e-908a-da0e5748a9bb" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.532623 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85kpf" event={"ID":"e7c62b6d-4e36-487e-908a-da0e5748a9bb","Type":"ContainerStarted","Data":"ac1cb293c95da2210f485dd2ec05839e94d59ecf2f234bb3dd88d841ba070a78"} Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.532701 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85kpf" Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.532716 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85kpf" event={"ID":"e7c62b6d-4e36-487e-908a-da0e5748a9bb","Type":"ContainerStarted","Data":"2dec9cd6a3527c3896fd232ff7d745374ad99345b6f999f3dc91e4024d921708"} Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.533774 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:05 crc kubenswrapper[4869]: E0218 05:49:05.534421 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:06.034396612 +0000 UTC m=+43.203484844 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.599360 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fbl5d" podStartSLOduration=19.599338072 podStartE2EDuration="19.599338072s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:05.383890818 +0000 UTC m=+42.552979050" watchObservedRunningTime="2026-02-18 05:49:05.599338072 +0000 UTC m=+42.768426294" Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.600001 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85kpf" podStartSLOduration=18.599995598 podStartE2EDuration="18.599995598s" podCreationTimestamp="2026-02-18 05:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:05.590306383 +0000 UTC m=+42.759394615" watchObservedRunningTime="2026-02-18 05:49:05.599995598 +0000 UTC m=+42.769083830" Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.607700 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bft8b" event={"ID":"33549d64-6034-4f60-b254-2729e899e541","Type":"ContainerStarted","Data":"c2beceb462cfc9427f19f98cc9aae23829ae3f840b12b56fb95e79989c4532cc"} Feb 18 05:49:05 crc 
kubenswrapper[4869]: I0218 05:49:05.643641 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:05 crc kubenswrapper[4869]: E0218 05:49:05.645129 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:06.145116517 +0000 UTC m=+43.314204749 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.670058 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8xm9f" event={"ID":"ec275d88-805e-40d9-8c06-0c86712f739c","Type":"ContainerStarted","Data":"8ab06673f163e35abd1bfe59277dc8e513f2f740e8c4da6257dc4033e27b7cbd"} Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.686723 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pmqtp" event={"ID":"be5b136e-e664-4ed5-9fbd-e2a9bdd06db9","Type":"ContainerStarted","Data":"bc6d16fb366589e531132246573f3f409fff27089a88ad91434257e24ae5e0c2"} Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.713987 
4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jc49r" event={"ID":"f6f562a0-0eb8-422c-9753-b637b745e579","Type":"ContainerStarted","Data":"6bcce847d10b092d3535d0a2fd76b4e2a336d9915f62ae1555009612d1f27b95"} Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.745138 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-8xm9f" podStartSLOduration=7.745117581 podStartE2EDuration="7.745117581s" podCreationTimestamp="2026-02-18 05:48:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:05.741364249 +0000 UTC m=+42.910452481" watchObservedRunningTime="2026-02-18 05:49:05.745117581 +0000 UTC m=+42.914205813" Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.747289 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:05 crc kubenswrapper[4869]: E0218 05:49:05.751104 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:06.251079106 +0000 UTC m=+43.420167338 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.760317 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p8b4x" event={"ID":"f96a9f3e-b511-4f78-83fd-a134aa3d5106","Type":"ContainerStarted","Data":"08191a81c4fd514a4024ab25507760671caba0ee0fdb8261f7a3d8013f73083c"} Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.775818 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5d72" event={"ID":"79b505d9-7c7c-4ef4-a689-e2b80d855a09","Type":"ContainerStarted","Data":"f3a624c62d9944f69426f736bc9856ea5387fccab27994a1d2959667f345edb9"} Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.776450 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5d72" Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.777587 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-j2h97" event={"ID":"c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf","Type":"ContainerStarted","Data":"e7ba3a4c09402694d3eb14f70c7a6a424db9ddc92e4597ffdcc385fa50203789"} Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.797175 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5d72" Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.801069 4869 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2hlr" event={"ID":"04d75c03-610d-490c-a8f3-f98ca377ccda","Type":"ContainerStarted","Data":"df8b0381d042e46aa0aebc7743d5062608b7ea7d61a7b3104eccc8440c5c8f42"} Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.813870 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zmmhg" event={"ID":"b635dd52-43a3-4285-a6f9-86b872db734a","Type":"ContainerStarted","Data":"dc273feee7a062cafb7b43ae9a00b5e2ee705444635680cc7f17a6954cc89206"} Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.827887 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pmqtp" podStartSLOduration=19.827863865 podStartE2EDuration="19.827863865s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:05.826125773 +0000 UTC m=+42.995214005" watchObservedRunningTime="2026-02-18 05:49:05.827863865 +0000 UTC m=+42.996952097" Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.850375 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:05 crc kubenswrapper[4869]: E0218 05:49:05.859436 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-18 05:49:06.359414153 +0000 UTC m=+43.528502385 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.873493 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zd46x" event={"ID":"a5c8a86f-e3a2-4088-9839-386b9dc56d03","Type":"ContainerStarted","Data":"95da8d9bc2115bb8279152a8d9f1d2cb19e38f7fb924557c084d8af071fe7c8f"} Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.874538 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zd46x" Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.909143 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-tzbsp" event={"ID":"84302a1e-f497-4332-8736-5e53c6741081","Type":"ContainerStarted","Data":"3597e28c420231ddd68465b399a83a54999f58410fe431830c50b8cb0b2e08e9"} Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.918052 4869 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zd46x container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.918122 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zd46x" 
podUID="a5c8a86f-e3a2-4088-9839-386b9dc56d03" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.945196 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-mqfwk" Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.951546 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:05 crc kubenswrapper[4869]: E0218 05:49:05.951901 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:06.451869204 +0000 UTC m=+43.620957436 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.952065 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:05 crc kubenswrapper[4869]: E0218 05:49:05.953869 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:06.453859752 +0000 UTC m=+43.622947984 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.961363 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p8b4x" podStartSLOduration=19.961343605 podStartE2EDuration="19.961343605s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:05.898532726 +0000 UTC m=+43.067620958" watchObservedRunningTime="2026-02-18 05:49:05.961343605 +0000 UTC m=+43.130431837" Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.969779 4869 patch_prober.go:28] interesting pod/router-default-5444994796-pg7gk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 05:49:05 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld Feb 18 05:49:05 crc kubenswrapper[4869]: [+]process-running ok Feb 18 05:49:05 crc kubenswrapper[4869]: healthz check failed Feb 18 05:49:05 crc kubenswrapper[4869]: I0218 05:49:05.969832 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pg7gk" podUID="61805635-7d39-4980-b1be-18cf8f05074d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.056153 4869 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:06 crc kubenswrapper[4869]: E0218 05:49:06.056398 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:06.556355707 +0000 UTC m=+43.725443979 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.057230 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:06 crc kubenswrapper[4869]: E0218 05:49:06.057639 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:06.557629418 +0000 UTC m=+43.726717840 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.079401 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zmmhg" podStartSLOduration=19.079372467 podStartE2EDuration="19.079372467s" podCreationTimestamp="2026-02-18 05:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:05.9701903 +0000 UTC m=+43.139278532" watchObservedRunningTime="2026-02-18 05:49:06.079372467 +0000 UTC m=+43.248460699" Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.080018 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2hlr" podStartSLOduration=20.080013583 podStartE2EDuration="20.080013583s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:06.078291891 +0000 UTC m=+43.247380123" watchObservedRunningTime="2026-02-18 05:49:06.080013583 +0000 UTC m=+43.249101815" Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.159160 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:06 crc kubenswrapper[4869]: E0218 05:49:06.159548 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:06.659528038 +0000 UTC m=+43.828616270 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.199567 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5d72" podStartSLOduration=19.199542313 podStartE2EDuration="19.199542313s" podCreationTimestamp="2026-02-18 05:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:06.11561026 +0000 UTC m=+43.284698492" watchObservedRunningTime="2026-02-18 05:49:06.199542313 +0000 UTC m=+43.368630545" Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.202385 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zd46x" podStartSLOduration=20.202379792 podStartE2EDuration="20.202379792s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 
05:49:06.184502466 +0000 UTC m=+43.353590688" watchObservedRunningTime="2026-02-18 05:49:06.202379792 +0000 UTC m=+43.371468024" Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.227611 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-tzbsp" podStartSLOduration=19.227574985 podStartE2EDuration="19.227574985s" podCreationTimestamp="2026-02-18 05:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:06.226999971 +0000 UTC m=+43.396088203" watchObservedRunningTime="2026-02-18 05:49:06.227574985 +0000 UTC m=+43.396663217" Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.248953 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-zc447"] Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.260463 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:06 crc kubenswrapper[4869]: E0218 05:49:06.260850 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:06.760838054 +0000 UTC m=+43.929926286 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.361782 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:06 crc kubenswrapper[4869]: E0218 05:49:06.362225 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:06.862204852 +0000 UTC m=+44.031293084 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.463224 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:06 crc kubenswrapper[4869]: E0218 05:49:06.463826 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:06.963788214 +0000 UTC m=+44.132876446 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.476688 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fbl5d" Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.570405 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:06 crc kubenswrapper[4869]: E0218 05:49:06.570530 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:07.070509593 +0000 UTC m=+44.239597825 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.570926 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:06 crc kubenswrapper[4869]: E0218 05:49:06.571256 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:07.07125028 +0000 UTC m=+44.240338512 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.675602 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:06 crc kubenswrapper[4869]: E0218 05:49:06.676449 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:07.176425391 +0000 UTC m=+44.345513623 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.778007 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:06 crc kubenswrapper[4869]: E0218 05:49:06.778499 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:07.278477355 +0000 UTC m=+44.447565587 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.810637 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lfkdw"] Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.811734 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lfkdw" Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.814912 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.830816 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lfkdw"] Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.879017 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:06 crc kubenswrapper[4869]: E0218 05:49:06.879246 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:07.379216207 +0000 UTC m=+44.548304439 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.879359 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.879426 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f16542c9-445d-4f1a-883d-0a5306a6e0da-catalog-content\") pod \"certified-operators-lfkdw\" (UID: \"f16542c9-445d-4f1a-883d-0a5306a6e0da\") " pod="openshift-marketplace/certified-operators-lfkdw" Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.879480 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k692c\" (UniqueName: \"kubernetes.io/projected/f16542c9-445d-4f1a-883d-0a5306a6e0da-kube-api-access-k692c\") pod \"certified-operators-lfkdw\" (UID: \"f16542c9-445d-4f1a-883d-0a5306a6e0da\") " pod="openshift-marketplace/certified-operators-lfkdw" Feb 18 05:49:06 crc kubenswrapper[4869]: E0218 05:49:06.879687 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-18 05:49:07.379679508 +0000 UTC m=+44.548767740 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.879688 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f16542c9-445d-4f1a-883d-0a5306a6e0da-utilities\") pod \"certified-operators-lfkdw\" (UID: \"f16542c9-445d-4f1a-883d-0a5306a6e0da\") " pod="openshift-marketplace/certified-operators-lfkdw" Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.918634 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7qwwp" event={"ID":"65822bd2-50a3-41f3-b50d-ec86090ce4d4","Type":"ContainerStarted","Data":"e243550e0af4d71355ab9160af29f2e9e9e37024aead096ddd7da2e85ad20759"} Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.919250 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7qwwp" Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.924294 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2hlr" event={"ID":"04d75c03-610d-490c-a8f3-f98ca377ccda","Type":"ContainerStarted","Data":"2fccb2426af0cce22c798f5cccbf8df39a24224ad80c572c699da40a113f1ae8"} Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.928773 4869 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qkfs" event={"ID":"ca46101e-e80b-47e2-bd0e-086ca89be1b1","Type":"ContainerStarted","Data":"4b813dec3e85841d323cd96a46f8468563b216778f42764537a0d46a1b39aed7"} Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.933482 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lpmxp" event={"ID":"98621832-7945-48e2-9aa9-f3fa87801887","Type":"ContainerStarted","Data":"40f1c22fd99586b07464f6523dc1b5c33b1bfba037ff5e54b75fc3fc0fa66065"} Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.933544 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lpmxp" event={"ID":"98621832-7945-48e2-9aa9-f3fa87801887","Type":"ContainerStarted","Data":"74f54c04746fcfede33c8c64b52c0bc10c60e01070529aa8d0c321ed1f0ca1ea"} Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.937488 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ckzlt" event={"ID":"1c4bcbdd-2490-4d47-b2b3-a2e832c63100","Type":"ContainerStarted","Data":"92f2a083b63e1962e91cad815f231b04cadca298b02793531450e8a5d4a1e4b9"} Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.937523 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ckzlt" event={"ID":"1c4bcbdd-2490-4d47-b2b3-a2e832c63100","Type":"ContainerStarted","Data":"7e5b1e02e47e8bc811756b65714a6943a0c7870284d16e83e7c4af19ad275cd0"} Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.940538 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h6w89" event={"ID":"6d2a84e5-581e-4dc3-93fe-b515f3b0f0e2","Type":"ContainerStarted","Data":"293106eba8e0ac8ba1731d3d135289f4b41330b3bde12e502d37efbd5de91cfc"} Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.940580 4869 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-dns/dns-default-h6w89" event={"ID":"6d2a84e5-581e-4dc3-93fe-b515f3b0f0e2","Type":"ContainerStarted","Data":"611e9bca9a4b792de5409185f011a239549fed4c6da7ec9c758794b835e4d070"} Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.941150 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-h6w89" Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.947124 4869 patch_prober.go:28] interesting pod/router-default-5444994796-pg7gk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 05:49:06 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld Feb 18 05:49:06 crc kubenswrapper[4869]: [+]process-running ok Feb 18 05:49:06 crc kubenswrapper[4869]: healthz check failed Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.947178 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pg7gk" podUID="61805635-7d39-4980-b1be-18cf8f05074d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.952450 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-t9nfk" event={"ID":"1a6cedf4-7ab8-4763-8ccf-78c6a074b9c0","Type":"ContainerStarted","Data":"dc43d136d776e9cac87deed47609d55f303ded28f8550eafc5c1c44a7bfc33c6"} Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.960409 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2mz86" event={"ID":"ad2d8057-b6d0-48fe-b95d-3126a4631754","Type":"ContainerStarted","Data":"389d8024dd20ba5b9831f7eca792d5b02fb3ac0f92117765b2b4708785ba966c"} Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.960457 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2mz86" event={"ID":"ad2d8057-b6d0-48fe-b95d-3126a4631754","Type":"ContainerStarted","Data":"bf1b18d1caaf8e2b43bffe0d4a406842a340064d2e3420d1f39f8556f28a1378"} Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.962132 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-59xq2" event={"ID":"9742d031-8f05-438c-8028-700eb13042fe","Type":"ContainerStarted","Data":"59d023c508b1c6a9da29c48bd95c3678179705494df44f24f8659f3b13293e15"} Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.963872 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hj6gq" event={"ID":"464a7f33-924c-4f63-8c70-b9fac3a3a4c2","Type":"ContainerStarted","Data":"0abcc5f9b8feccc6bd9de56dcfbfa62c63decf5395bb232db7b978f72811a7b5"} Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.963899 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hj6gq" event={"ID":"464a7f33-924c-4f63-8c70-b9fac3a3a4c2","Type":"ContainerStarted","Data":"ba51d4a96412a89c154479018f114ad4962e48b6583890470e07ed6be5298620"} Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.964337 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hj6gq" Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.970712 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" event={"ID":"b8d17264-7c40-4eb6-82eb-f4020e635dde","Type":"ContainerStarted","Data":"b3407e7e465512b6764dee50f06273a47c246a1ef021ab6ae0b7ebc3500639fe"} Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.973168 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-cnzqw" 
event={"ID":"67eed62e-4836-47bb-96d0-cd6668d64a78","Type":"ContainerStarted","Data":"abe696730795f53f0ceab44be5f640f1127a4773a189c4c5e9bbe744e80aa0ba"} Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.978154 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7qwwp" podStartSLOduration=19.978135435 podStartE2EDuration="19.978135435s" podCreationTimestamp="2026-02-18 05:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:06.976943076 +0000 UTC m=+44.146031308" watchObservedRunningTime="2026-02-18 05:49:06.978135435 +0000 UTC m=+44.147223667" Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.978629 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jc49r" event={"ID":"f6f562a0-0eb8-422c-9753-b637b745e579","Type":"ContainerStarted","Data":"3b78bfeaa0b414b7095829e82d9c9aa3425aaeb93ba2c1409667f6c42176c8e3"} Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.978689 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jc49r" event={"ID":"f6f562a0-0eb8-422c-9753-b637b745e579","Type":"ContainerStarted","Data":"9384658039e6f87a6aead3e8279977bc3ef16716a244db15bcc304b190ada6d3"} Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.986427 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.986670 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k692c\" (UniqueName: 
\"kubernetes.io/projected/f16542c9-445d-4f1a-883d-0a5306a6e0da-kube-api-access-k692c\") pod \"certified-operators-lfkdw\" (UID: \"f16542c9-445d-4f1a-883d-0a5306a6e0da\") " pod="openshift-marketplace/certified-operators-lfkdw" Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.986788 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f16542c9-445d-4f1a-883d-0a5306a6e0da-utilities\") pod \"certified-operators-lfkdw\" (UID: \"f16542c9-445d-4f1a-883d-0a5306a6e0da\") " pod="openshift-marketplace/certified-operators-lfkdw" Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.986934 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f16542c9-445d-4f1a-883d-0a5306a6e0da-catalog-content\") pod \"certified-operators-lfkdw\" (UID: \"f16542c9-445d-4f1a-883d-0a5306a6e0da\") " pod="openshift-marketplace/certified-operators-lfkdw" Feb 18 05:49:06 crc kubenswrapper[4869]: E0218 05:49:06.988000 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:07.487972745 +0000 UTC m=+44.657060977 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.988536 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f16542c9-445d-4f1a-883d-0a5306a6e0da-utilities\") pod \"certified-operators-lfkdw\" (UID: \"f16542c9-445d-4f1a-883d-0a5306a6e0da\") " pod="openshift-marketplace/certified-operators-lfkdw" Feb 18 05:49:06 crc kubenswrapper[4869]: I0218 05:49:06.989911 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f16542c9-445d-4f1a-883d-0a5306a6e0da-catalog-content\") pod \"certified-operators-lfkdw\" (UID: \"f16542c9-445d-4f1a-883d-0a5306a6e0da\") " pod="openshift-marketplace/certified-operators-lfkdw" Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.000972 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bft8b" event={"ID":"33549d64-6034-4f60-b254-2729e899e541","Type":"ContainerStarted","Data":"4a8d78c5ed9c62dfbaa4f0d04f1492dbb1bc64d88937af999ce8ebec0a88dbb2"} Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.003141 4869 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zd46x container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.003226 4869 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zd46x" podUID="a5c8a86f-e3a2-4088-9839-386b9dc56d03" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.014929 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lpmxp" podStartSLOduration=21.01490344 podStartE2EDuration="21.01490344s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:07.014290025 +0000 UTC m=+44.183378257" watchObservedRunningTime="2026-02-18 05:49:07.01490344 +0000 UTC m=+44.183991672" Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.027115 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85kpf" Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.035048 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k692c\" (UniqueName: \"kubernetes.io/projected/f16542c9-445d-4f1a-883d-0a5306a6e0da-kube-api-access-k692c\") pod \"certified-operators-lfkdw\" (UID: \"f16542c9-445d-4f1a-883d-0a5306a6e0da\") " pod="openshift-marketplace/certified-operators-lfkdw" Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.037212 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2q2rk"] Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.038200 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2q2rk" Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.074951 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.083348 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2q2rk"] Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.089721 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fa8c341-03cc-49d4-8793-88cec4a8444d-catalog-content\") pod \"community-operators-2q2rk\" (UID: \"1fa8c341-03cc-49d4-8793-88cec4a8444d\") " pod="openshift-marketplace/community-operators-2q2rk" Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.089816 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:07 crc kubenswrapper[4869]: E0218 05:49:07.091773 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:07.59173439 +0000 UTC m=+44.760822622 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.093065 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbfq9\" (UniqueName: \"kubernetes.io/projected/1fa8c341-03cc-49d4-8793-88cec4a8444d-kube-api-access-lbfq9\") pod \"community-operators-2q2rk\" (UID: \"1fa8c341-03cc-49d4-8793-88cec4a8444d\") " pod="openshift-marketplace/community-operators-2q2rk" Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.093125 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fa8c341-03cc-49d4-8793-88cec4a8444d-utilities\") pod \"community-operators-2q2rk\" (UID: \"1fa8c341-03cc-49d4-8793-88cec4a8444d\") " pod="openshift-marketplace/community-operators-2q2rk" Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.131400 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lfkdw" Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.161842 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" podStartSLOduration=20.161816676 podStartE2EDuration="20.161816676s" podCreationTimestamp="2026-02-18 05:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:07.153177436 +0000 UTC m=+44.322265668" watchObservedRunningTime="2026-02-18 05:49:07.161816676 +0000 UTC m=+44.330904908" Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.195464 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.196195 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbfq9\" (UniqueName: \"kubernetes.io/projected/1fa8c341-03cc-49d4-8793-88cec4a8444d-kube-api-access-lbfq9\") pod \"community-operators-2q2rk\" (UID: \"1fa8c341-03cc-49d4-8793-88cec4a8444d\") " pod="openshift-marketplace/community-operators-2q2rk" Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.196300 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fa8c341-03cc-49d4-8793-88cec4a8444d-utilities\") pod \"community-operators-2q2rk\" (UID: \"1fa8c341-03cc-49d4-8793-88cec4a8444d\") " pod="openshift-marketplace/community-operators-2q2rk" Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.196493 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fa8c341-03cc-49d4-8793-88cec4a8444d-catalog-content\") pod \"community-operators-2q2rk\" (UID: \"1fa8c341-03cc-49d4-8793-88cec4a8444d\") " pod="openshift-marketplace/community-operators-2q2rk" Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.197106 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fa8c341-03cc-49d4-8793-88cec4a8444d-catalog-content\") pod \"community-operators-2q2rk\" (UID: \"1fa8c341-03cc-49d4-8793-88cec4a8444d\") " pod="openshift-marketplace/community-operators-2q2rk" Feb 18 05:49:07 crc kubenswrapper[4869]: E0218 05:49:07.197258 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:07.697239968 +0000 UTC m=+44.866328200 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.197852 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fa8c341-03cc-49d4-8793-88cec4a8444d-utilities\") pod \"community-operators-2q2rk\" (UID: \"1fa8c341-03cc-49d4-8793-88cec4a8444d\") " pod="openshift-marketplace/community-operators-2q2rk" Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.223893 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g8mjl"] Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.233092 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-cnzqw" podStartSLOduration=21.233063821000002 podStartE2EDuration="21.233063821s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:07.211357382 +0000 UTC m=+44.380445614" watchObservedRunningTime="2026-02-18 05:49:07.233063821 +0000 UTC m=+44.402152053" Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.247081 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g8mjl"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.247816 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g8mjl"]
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.254866 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbfq9\" (UniqueName: \"kubernetes.io/projected/1fa8c341-03cc-49d4-8793-88cec4a8444d-kube-api-access-lbfq9\") pod \"community-operators-2q2rk\" (UID: \"1fa8c341-03cc-49d4-8793-88cec4a8444d\") " pod="openshift-marketplace/community-operators-2q2rk"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.278273 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2mz86" podStartSLOduration=21.27825062 podStartE2EDuration="21.27825062s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:07.276912168 +0000 UTC m=+44.446000420" watchObservedRunningTime="2026-02-18 05:49:07.27825062 +0000 UTC m=+44.447338852"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.298421 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5de90b28-d647-4947-b85a-9f65e908ac02-utilities\") pod \"certified-operators-g8mjl\" (UID: \"5de90b28-d647-4947-b85a-9f65e908ac02\") " pod="openshift-marketplace/certified-operators-g8mjl"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.298481 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bww5k\" (UniqueName: \"kubernetes.io/projected/5de90b28-d647-4947-b85a-9f65e908ac02-kube-api-access-bww5k\") pod \"certified-operators-g8mjl\" (UID: \"5de90b28-d647-4947-b85a-9f65e908ac02\") " pod="openshift-marketplace/certified-operators-g8mjl"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.298687 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5de90b28-d647-4947-b85a-9f65e908ac02-catalog-content\") pod \"certified-operators-g8mjl\" (UID: \"5de90b28-d647-4947-b85a-9f65e908ac02\") " pod="openshift-marketplace/certified-operators-g8mjl"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.298730 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw"
Feb 18 05:49:07 crc kubenswrapper[4869]: E0218 05:49:07.299118 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:07.799102948 +0000 UTC m=+44.968191170 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.347463 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qkfs" podStartSLOduration=21.347441835 podStartE2EDuration="21.347441835s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:07.344261347 +0000 UTC m=+44.513349589" watchObservedRunningTime="2026-02-18 05:49:07.347441835 +0000 UTC m=+44.516530067"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.364141 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2q2rk"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.402575 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.403372 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5de90b28-d647-4947-b85a-9f65e908ac02-catalog-content\") pod \"certified-operators-g8mjl\" (UID: \"5de90b28-d647-4947-b85a-9f65e908ac02\") " pod="openshift-marketplace/certified-operators-g8mjl"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.403437 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5de90b28-d647-4947-b85a-9f65e908ac02-utilities\") pod \"certified-operators-g8mjl\" (UID: \"5de90b28-d647-4947-b85a-9f65e908ac02\") " pod="openshift-marketplace/certified-operators-g8mjl"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.403463 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bww5k\" (UniqueName: \"kubernetes.io/projected/5de90b28-d647-4947-b85a-9f65e908ac02-kube-api-access-bww5k\") pod \"certified-operators-g8mjl\" (UID: \"5de90b28-d647-4947-b85a-9f65e908ac02\") " pod="openshift-marketplace/certified-operators-g8mjl"
Feb 18 05:49:07 crc kubenswrapper[4869]: E0218 05:49:07.404001 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:07.903984731 +0000 UTC m=+45.073072963 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.404536 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5de90b28-d647-4947-b85a-9f65e908ac02-catalog-content\") pod \"certified-operators-g8mjl\" (UID: \"5de90b28-d647-4947-b85a-9f65e908ac02\") " pod="openshift-marketplace/certified-operators-g8mjl"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.404843 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5de90b28-d647-4947-b85a-9f65e908ac02-utilities\") pod \"certified-operators-g8mjl\" (UID: \"5de90b28-d647-4947-b85a-9f65e908ac02\") " pod="openshift-marketplace/certified-operators-g8mjl"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.465561 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hj6gq" podStartSLOduration=20.465535839 podStartE2EDuration="20.465535839s" podCreationTimestamp="2026-02-18 05:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:07.44747799 +0000 UTC m=+44.616566222" watchObservedRunningTime="2026-02-18 05:49:07.465535839 +0000 UTC m=+44.634624071"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.467301 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bww5k\" (UniqueName: \"kubernetes.io/projected/5de90b28-d647-4947-b85a-9f65e908ac02-kube-api-access-bww5k\") pod \"certified-operators-g8mjl\" (UID: \"5de90b28-d647-4947-b85a-9f65e908ac02\") " pod="openshift-marketplace/certified-operators-g8mjl"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.500280 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-brm92"]
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.501348 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-brm92"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.505865 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-brm92"]
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.506196 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw"
Feb 18 05:49:07 crc kubenswrapper[4869]: E0218 05:49:07.506648 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:08.00663324 +0000 UTC m=+45.175721472 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.538912 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-h6w89" podStartSLOduration=9.538890795 podStartE2EDuration="9.538890795s" podCreationTimestamp="2026-02-18 05:48:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:07.537181553 +0000 UTC m=+44.706269785" watchObservedRunningTime="2026-02-18 05:49:07.538890795 +0000 UTC m=+44.707979027"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.607279 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.607548 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8m7j\" (UniqueName: \"kubernetes.io/projected/143f68ff-2d5f-435c-8a05-ebc0433fca48-kube-api-access-q8m7j\") pod \"community-operators-brm92\" (UID: \"143f68ff-2d5f-435c-8a05-ebc0433fca48\") " pod="openshift-marketplace/community-operators-brm92"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.607590 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/143f68ff-2d5f-435c-8a05-ebc0433fca48-catalog-content\") pod \"community-operators-brm92\" (UID: \"143f68ff-2d5f-435c-8a05-ebc0433fca48\") " pod="openshift-marketplace/community-operators-brm92"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.607628 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/143f68ff-2d5f-435c-8a05-ebc0433fca48-utilities\") pod \"community-operators-brm92\" (UID: \"143f68ff-2d5f-435c-8a05-ebc0433fca48\") " pod="openshift-marketplace/community-operators-brm92"
Feb 18 05:49:07 crc kubenswrapper[4869]: E0218 05:49:07.607795 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:08.107775651 +0000 UTC m=+45.276863883 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.612640 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ckzlt" podStartSLOduration=21.61263046 podStartE2EDuration="21.61263046s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:07.605354553 +0000 UTC m=+44.774442785" watchObservedRunningTime="2026-02-18 05:49:07.61263046 +0000 UTC m=+44.781718692"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.625089 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g8mjl"
Feb 18 05:49:07 crc kubenswrapper[4869]: E0218 05:49:07.712140 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:08.212126142 +0000 UTC m=+45.381214374 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.712295 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.712372 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8m7j\" (UniqueName: \"kubernetes.io/projected/143f68ff-2d5f-435c-8a05-ebc0433fca48-kube-api-access-q8m7j\") pod \"community-operators-brm92\" (UID: \"143f68ff-2d5f-435c-8a05-ebc0433fca48\") " pod="openshift-marketplace/community-operators-brm92"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.712410 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/143f68ff-2d5f-435c-8a05-ebc0433fca48-catalog-content\") pod \"community-operators-brm92\" (UID: \"143f68ff-2d5f-435c-8a05-ebc0433fca48\") " pod="openshift-marketplace/community-operators-brm92"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.712452 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/143f68ff-2d5f-435c-8a05-ebc0433fca48-utilities\") pod \"community-operators-brm92\" (UID: \"143f68ff-2d5f-435c-8a05-ebc0433fca48\") " pod="openshift-marketplace/community-operators-brm92"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.712942 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/143f68ff-2d5f-435c-8a05-ebc0433fca48-utilities\") pod \"community-operators-brm92\" (UID: \"143f68ff-2d5f-435c-8a05-ebc0433fca48\") " pod="openshift-marketplace/community-operators-brm92"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.713446 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/143f68ff-2d5f-435c-8a05-ebc0433fca48-catalog-content\") pod \"community-operators-brm92\" (UID: \"143f68ff-2d5f-435c-8a05-ebc0433fca48\") " pod="openshift-marketplace/community-operators-brm92"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.807079 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8m7j\" (UniqueName: \"kubernetes.io/projected/143f68ff-2d5f-435c-8a05-ebc0433fca48-kube-api-access-q8m7j\") pod \"community-operators-brm92\" (UID: \"143f68ff-2d5f-435c-8a05-ebc0433fca48\") " pod="openshift-marketplace/community-operators-brm92"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.813630 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:49:07 crc kubenswrapper[4869]: E0218 05:49:07.814210 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:08.314192856 +0000 UTC m=+45.483281088 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.870120 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-brm92"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.889888 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7qwwp"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.899082 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-bft8b" podStartSLOduration=21.899062142 podStartE2EDuration="21.899062142s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:07.861929788 +0000 UTC m=+45.031018020" watchObservedRunningTime="2026-02-18 05:49:07.899062142 +0000 UTC m=+45.068150374"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.928044 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-jc49r" podStartSLOduration=21.928026707 podStartE2EDuration="21.928026707s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:07.925127887 +0000 UTC m=+45.094216119" watchObservedRunningTime="2026-02-18 05:49:07.928026707 +0000 UTC m=+45.097114929"
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.929868 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw"
Feb 18 05:49:07 crc kubenswrapper[4869]: E0218 05:49:07.930377 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:08.430362774 +0000 UTC m=+45.599451006 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.956909 4869 patch_prober.go:28] interesting pod/router-default-5444994796-pg7gk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 05:49:07 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld
Feb 18 05:49:07 crc kubenswrapper[4869]: [+]process-running ok
Feb 18 05:49:07 crc kubenswrapper[4869]: healthz check failed
Feb 18 05:49:07 crc kubenswrapper[4869]: I0218 05:49:07.957286 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pg7gk" podUID="61805635-7d39-4980-b1be-18cf8f05074d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.033552 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:49:08 crc kubenswrapper[4869]: E0218 05:49:08.034223 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:08.534200932 +0000 UTC m=+45.703289164 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.059673 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-zc447" podUID="873e5900-9845-4229-9d30-8b59c34f86fc" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://6592c56fc0692e811ff75555bd961fe7caa7d8b8eae9f718fec9e50a184f7b08" gracePeriod=30
Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.076889 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lfkdw"]
Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.101230 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zd46x"
Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.141000 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw"
Feb 18 05:49:08 crc kubenswrapper[4869]: E0218 05:49:08.141363 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:08.641346561 +0000 UTC m=+45.810434803 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.231368 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2q2rk"]
Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.241580 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:49:08 crc kubenswrapper[4869]: E0218 05:49:08.243082 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:08.743052636 +0000 UTC m=+45.912140868 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.343884 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw"
Feb 18 05:49:08 crc kubenswrapper[4869]: E0218 05:49:08.344798 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:08.844783162 +0000 UTC m=+46.013871394 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.356248 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g8mjl"]
Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.448390 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:49:08 crc kubenswrapper[4869]: E0218 05:49:08.448519 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:08.948498677 +0000 UTC m=+46.117586899 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.448730 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw"
Feb 18 05:49:08 crc kubenswrapper[4869]: E0218 05:49:08.449119 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:08.949111911 +0000 UTC m=+46.118200143 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.547494 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-brm92"]
Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.553497 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:49:08 crc kubenswrapper[4869]: E0218 05:49:08.554026 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:09.054004815 +0000 UTC m=+46.223093047 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.655953 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw"
Feb 18 05:49:08 crc kubenswrapper[4869]: E0218 05:49:08.656399 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:09.156379827 +0000 UTC m=+46.325468059 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.745230 4869 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.757135 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:49:08 crc kubenswrapper[4869]: E0218 05:49:08.757295 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:09.257274183 +0000 UTC m=+46.426362415 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.757851 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw"
Feb 18 05:49:08 crc kubenswrapper[4869]: E0218 05:49:08.758262 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:09.258251477 +0000 UTC m=+46.427339709 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.783180 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gv5qc"]
Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.784327 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gv5qc"
Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.787061 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.795076 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gv5qc"]
Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.863394 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:49:08 crc kubenswrapper[4869]: E0218 05:49:08.863563 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:09.363538589 +0000 UTC m=+46.532626821 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.864146 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/742847e6-6cb2-458e-8a75-2a76a970c4a4-utilities\") pod \"redhat-marketplace-gv5qc\" (UID: \"742847e6-6cb2-458e-8a75-2a76a970c4a4\") " pod="openshift-marketplace/redhat-marketplace-gv5qc"
Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.864210 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw"
Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.864234 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/742847e6-6cb2-458e-8a75-2a76a970c4a4-catalog-content\") pod \"redhat-marketplace-gv5qc\" (UID: \"742847e6-6cb2-458e-8a75-2a76a970c4a4\") " pod="openshift-marketplace/redhat-marketplace-gv5qc"
Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.864254 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhl9s\" (UniqueName:
\"kubernetes.io/projected/742847e6-6cb2-458e-8a75-2a76a970c4a4-kube-api-access-mhl9s\") pod \"redhat-marketplace-gv5qc\" (UID: \"742847e6-6cb2-458e-8a75-2a76a970c4a4\") " pod="openshift-marketplace/redhat-marketplace-gv5qc" Feb 18 05:49:08 crc kubenswrapper[4869]: E0218 05:49:08.864719 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:09.364693798 +0000 UTC m=+46.533782030 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.945778 4869 patch_prober.go:28] interesting pod/router-default-5444994796-pg7gk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 05:49:08 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld Feb 18 05:49:08 crc kubenswrapper[4869]: [+]process-running ok Feb 18 05:49:08 crc kubenswrapper[4869]: healthz check failed Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.945847 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pg7gk" podUID="61805635-7d39-4980-b1be-18cf8f05074d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.965579 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.965779 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/742847e6-6cb2-458e-8a75-2a76a970c4a4-catalog-content\") pod \"redhat-marketplace-gv5qc\" (UID: \"742847e6-6cb2-458e-8a75-2a76a970c4a4\") " pod="openshift-marketplace/redhat-marketplace-gv5qc" Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.965807 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhl9s\" (UniqueName: \"kubernetes.io/projected/742847e6-6cb2-458e-8a75-2a76a970c4a4-kube-api-access-mhl9s\") pod \"redhat-marketplace-gv5qc\" (UID: \"742847e6-6cb2-458e-8a75-2a76a970c4a4\") " pod="openshift-marketplace/redhat-marketplace-gv5qc" Feb 18 05:49:08 crc kubenswrapper[4869]: E0218 05:49:08.965933 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:09.465891991 +0000 UTC m=+46.634980313 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.966174 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/742847e6-6cb2-458e-8a75-2a76a970c4a4-utilities\") pod \"redhat-marketplace-gv5qc\" (UID: \"742847e6-6cb2-458e-8a75-2a76a970c4a4\") " pod="openshift-marketplace/redhat-marketplace-gv5qc" Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.966424 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/742847e6-6cb2-458e-8a75-2a76a970c4a4-catalog-content\") pod \"redhat-marketplace-gv5qc\" (UID: \"742847e6-6cb2-458e-8a75-2a76a970c4a4\") " pod="openshift-marketplace/redhat-marketplace-gv5qc" Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.966899 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/742847e6-6cb2-458e-8a75-2a76a970c4a4-utilities\") pod \"redhat-marketplace-gv5qc\" (UID: \"742847e6-6cb2-458e-8a75-2a76a970c4a4\") " pod="openshift-marketplace/redhat-marketplace-gv5qc" Feb 18 05:49:08 crc kubenswrapper[4869]: I0218 05:49:08.989672 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhl9s\" (UniqueName: \"kubernetes.io/projected/742847e6-6cb2-458e-8a75-2a76a970c4a4-kube-api-access-mhl9s\") pod \"redhat-marketplace-gv5qc\" (UID: \"742847e6-6cb2-458e-8a75-2a76a970c4a4\") " pod="openshift-marketplace/redhat-marketplace-gv5qc" Feb 18 
05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.055698 4869 generic.go:334] "Generic (PLEG): container finished" podID="5de90b28-d647-4947-b85a-9f65e908ac02" containerID="72fe792617d1787cd028ad9fc3f59d3cbf7a2b074e097f9dec94edd6f7d83ca2" exitCode=0 Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.055773 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8mjl" event={"ID":"5de90b28-d647-4947-b85a-9f65e908ac02","Type":"ContainerDied","Data":"72fe792617d1787cd028ad9fc3f59d3cbf7a2b074e097f9dec94edd6f7d83ca2"} Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.055840 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8mjl" event={"ID":"5de90b28-d647-4947-b85a-9f65e908ac02","Type":"ContainerStarted","Data":"eeaa3a4498459dafdbebff7b412237a0064a5053f26b164d33fa6eaac0371b9b"} Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.057516 4869 generic.go:334] "Generic (PLEG): container finished" podID="1fa8c341-03cc-49d4-8793-88cec4a8444d" containerID="302ff42c06139a942207345263eb7eb7bf5f94c64767e7314442cc6715e7062c" exitCode=0 Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.057591 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2q2rk" event={"ID":"1fa8c341-03cc-49d4-8793-88cec4a8444d","Type":"ContainerDied","Data":"302ff42c06139a942207345263eb7eb7bf5f94c64767e7314442cc6715e7062c"} Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.057613 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2q2rk" event={"ID":"1fa8c341-03cc-49d4-8793-88cec4a8444d","Type":"ContainerStarted","Data":"894f49cc3a135fcbce848afbd9f95fa99017a4f61510c627b93ffa162abc422f"} Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.059650 4869 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 05:49:09 crc kubenswrapper[4869]: 
I0218 05:49:09.065660 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-59xq2" event={"ID":"9742d031-8f05-438c-8028-700eb13042fe","Type":"ContainerStarted","Data":"bc02e4a471e75ac428a8ceb81309b14cc2c08ed2b049e1ed649a134bf98c327b"} Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.065713 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-59xq2" event={"ID":"9742d031-8f05-438c-8028-700eb13042fe","Type":"ContainerStarted","Data":"8288f883d99782bc4ae9f631d236211d6a8b63d6d9bf96ad4e7fc01e0230f0ed"} Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.065735 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-59xq2" event={"ID":"9742d031-8f05-438c-8028-700eb13042fe","Type":"ContainerStarted","Data":"451a2408c4a42c805e6dbd6da772c61bb08f9004db4f5cf7740acec067caaa5a"} Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.068104 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:09 crc kubenswrapper[4869]: E0218 05:49:09.068458 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:09.568443578 +0000 UTC m=+46.737531810 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.073448 4869 generic.go:334] "Generic (PLEG): container finished" podID="f16542c9-445d-4f1a-883d-0a5306a6e0da" containerID="efe9e14c1822d9fedf3185cd7ce4ff3e8a94bf07d16835e5684c013d58efa762" exitCode=0 Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.073535 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lfkdw" event={"ID":"f16542c9-445d-4f1a-883d-0a5306a6e0da","Type":"ContainerDied","Data":"efe9e14c1822d9fedf3185cd7ce4ff3e8a94bf07d16835e5684c013d58efa762"} Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.073576 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lfkdw" event={"ID":"f16542c9-445d-4f1a-883d-0a5306a6e0da","Type":"ContainerStarted","Data":"d082db128eb59e3e029736eed6babda3794910c03f08794ec85ed791ea932299"} Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.076995 4869 generic.go:334] "Generic (PLEG): container finished" podID="143f68ff-2d5f-435c-8a05-ebc0433fca48" containerID="7c6074869d2c2db1664c926ba08a22ebc407b53d191b385d76346e8532a8e917" exitCode=0 Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.077901 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brm92" event={"ID":"143f68ff-2d5f-435c-8a05-ebc0433fca48","Type":"ContainerDied","Data":"7c6074869d2c2db1664c926ba08a22ebc407b53d191b385d76346e8532a8e917"} Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.077925 
4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brm92" event={"ID":"143f68ff-2d5f-435c-8a05-ebc0433fca48","Type":"ContainerStarted","Data":"b2ab69e8b7ea84355d9e0455728ee1d790e1afe74bcb83051f7af1bfd0ac604f"} Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.112263 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-59xq2" podStartSLOduration=11.112242544 podStartE2EDuration="11.112242544s" podCreationTimestamp="2026-02-18 05:48:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:09.109399375 +0000 UTC m=+46.278487607" watchObservedRunningTime="2026-02-18 05:49:09.112242544 +0000 UTC m=+46.281330776" Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.127144 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gv5qc" Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.170624 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:09 crc kubenswrapper[4869]: E0218 05:49:09.170881 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:09.670856551 +0000 UTC m=+46.839944773 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.176385 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:09 crc kubenswrapper[4869]: E0218 05:49:09.180928 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:09.680905955 +0000 UTC m=+46.849994187 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.199786 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5vpbh"] Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.203840 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5vpbh" Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.216253 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5vpbh"] Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.277216 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:09 crc kubenswrapper[4869]: E0218 05:49:09.277482 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:09.777445215 +0000 UTC m=+46.946533447 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.277974 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a42348a5-962e-42b1-a0f5-67a89bd0532a-utilities\") pod \"redhat-marketplace-5vpbh\" (UID: \"a42348a5-962e-42b1-a0f5-67a89bd0532a\") " pod="openshift-marketplace/redhat-marketplace-5vpbh" Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.278010 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.278062 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a42348a5-962e-42b1-a0f5-67a89bd0532a-catalog-content\") pod \"redhat-marketplace-5vpbh\" (UID: \"a42348a5-962e-42b1-a0f5-67a89bd0532a\") " pod="openshift-marketplace/redhat-marketplace-5vpbh" Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.278082 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8gzp\" (UniqueName: 
\"kubernetes.io/projected/a42348a5-962e-42b1-a0f5-67a89bd0532a-kube-api-access-b8gzp\") pod \"redhat-marketplace-5vpbh\" (UID: \"a42348a5-962e-42b1-a0f5-67a89bd0532a\") " pod="openshift-marketplace/redhat-marketplace-5vpbh" Feb 18 05:49:09 crc kubenswrapper[4869]: E0218 05:49:09.278503 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:09.778480951 +0000 UTC m=+46.947569183 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.378711 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:09 crc kubenswrapper[4869]: E0218 05:49:09.378873 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:09.878840873 +0000 UTC m=+47.047929105 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.379151 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a42348a5-962e-42b1-a0f5-67a89bd0532a-utilities\") pod \"redhat-marketplace-5vpbh\" (UID: \"a42348a5-962e-42b1-a0f5-67a89bd0532a\") " pod="openshift-marketplace/redhat-marketplace-5vpbh" Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.379185 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.379255 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a42348a5-962e-42b1-a0f5-67a89bd0532a-catalog-content\") pod \"redhat-marketplace-5vpbh\" (UID: \"a42348a5-962e-42b1-a0f5-67a89bd0532a\") " pod="openshift-marketplace/redhat-marketplace-5vpbh" Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.379276 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8gzp\" (UniqueName: \"kubernetes.io/projected/a42348a5-962e-42b1-a0f5-67a89bd0532a-kube-api-access-b8gzp\") pod \"redhat-marketplace-5vpbh\" (UID: 
\"a42348a5-962e-42b1-a0f5-67a89bd0532a\") " pod="openshift-marketplace/redhat-marketplace-5vpbh" Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.379694 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a42348a5-962e-42b1-a0f5-67a89bd0532a-utilities\") pod \"redhat-marketplace-5vpbh\" (UID: \"a42348a5-962e-42b1-a0f5-67a89bd0532a\") " pod="openshift-marketplace/redhat-marketplace-5vpbh" Feb 18 05:49:09 crc kubenswrapper[4869]: E0218 05:49:09.379712 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:09.879690774 +0000 UTC m=+47.048779006 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.380067 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a42348a5-962e-42b1-a0f5-67a89bd0532a-catalog-content\") pod \"redhat-marketplace-5vpbh\" (UID: \"a42348a5-962e-42b1-a0f5-67a89bd0532a\") " pod="openshift-marketplace/redhat-marketplace-5vpbh" Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.381574 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gv5qc"] Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.405539 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-b8gzp\" (UniqueName: \"kubernetes.io/projected/a42348a5-962e-42b1-a0f5-67a89bd0532a-kube-api-access-b8gzp\") pod \"redhat-marketplace-5vpbh\" (UID: \"a42348a5-962e-42b1-a0f5-67a89bd0532a\") " pod="openshift-marketplace/redhat-marketplace-5vpbh" Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.479973 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:09 crc kubenswrapper[4869]: E0218 05:49:09.480387 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:49:09.980367935 +0000 UTC m=+47.149456157 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.524177 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5vpbh" Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.581691 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:09 crc kubenswrapper[4869]: E0218 05:49:09.582128 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:49:10.082108392 +0000 UTC m=+47.251196624 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6rqpw" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.646276 4869 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-18T05:49:08.745271011Z","Handler":null,"Name":""} Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.653523 4869 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.653584 4869 csi_plugin.go:113] 
kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.683446 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.697793 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.751339 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5vpbh"] Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.785432 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.796153 4869 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.796200 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.839612 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6rqpw\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") " pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.944993 4869 patch_prober.go:28] interesting pod/router-default-5444994796-pg7gk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 05:49:09 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld Feb 18 05:49:09 crc kubenswrapper[4869]: [+]process-running ok Feb 18 05:49:09 crc kubenswrapper[4869]: healthz check failed Feb 18 05:49:09 crc kubenswrapper[4869]: I0218 05:49:09.945075 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pg7gk" podUID="61805635-7d39-4980-b1be-18cf8f05074d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.011997 4869 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-operators-97mgq"] Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.013879 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-97mgq" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.017212 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-97mgq"] Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.017809 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.041584 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.094926 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxrgg\" (UniqueName: \"kubernetes.io/projected/ac953f18-4fbf-455f-b229-a51977890aa6-kube-api-access-hxrgg\") pod \"redhat-operators-97mgq\" (UID: \"ac953f18-4fbf-455f-b229-a51977890aa6\") " pod="openshift-marketplace/redhat-operators-97mgq" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.095542 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac953f18-4fbf-455f-b229-a51977890aa6-utilities\") pod \"redhat-operators-97mgq\" (UID: \"ac953f18-4fbf-455f-b229-a51977890aa6\") " pod="openshift-marketplace/redhat-operators-97mgq" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.095573 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac953f18-4fbf-455f-b229-a51977890aa6-catalog-content\") pod \"redhat-operators-97mgq\" (UID: \"ac953f18-4fbf-455f-b229-a51977890aa6\") " 
pod="openshift-marketplace/redhat-operators-97mgq" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.103121 4869 generic.go:334] "Generic (PLEG): container finished" podID="742847e6-6cb2-458e-8a75-2a76a970c4a4" containerID="9177c53668eb082dd00b3173b50e13223f1fec106e37d65b051ef8a45f74cbcb" exitCode=0 Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.103226 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gv5qc" event={"ID":"742847e6-6cb2-458e-8a75-2a76a970c4a4","Type":"ContainerDied","Data":"9177c53668eb082dd00b3173b50e13223f1fec106e37d65b051ef8a45f74cbcb"} Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.103318 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gv5qc" event={"ID":"742847e6-6cb2-458e-8a75-2a76a970c4a4","Type":"ContainerStarted","Data":"bdea75f28fab35f4c5ae8afec9b13e330fbdae6b37b02951b8148ead82354d94"} Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.107265 4869 generic.go:334] "Generic (PLEG): container finished" podID="a42348a5-962e-42b1-a0f5-67a89bd0532a" containerID="bd597523762c7b7ff787b2238bf48a9ebe1d39cc01d2a8591fd90d32171f7e37" exitCode=0 Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.108510 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vpbh" event={"ID":"a42348a5-962e-42b1-a0f5-67a89bd0532a","Type":"ContainerDied","Data":"bd597523762c7b7ff787b2238bf48a9ebe1d39cc01d2a8591fd90d32171f7e37"} Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.108541 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vpbh" event={"ID":"a42348a5-962e-42b1-a0f5-67a89bd0532a","Type":"ContainerStarted","Data":"4f3f8a26df68d05568af27c22a622dc304bc20728aa39f3d5a257e076d236c24"} Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.127053 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.127108 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.152451 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.197073 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac953f18-4fbf-455f-b229-a51977890aa6-utilities\") pod \"redhat-operators-97mgq\" (UID: \"ac953f18-4fbf-455f-b229-a51977890aa6\") " pod="openshift-marketplace/redhat-operators-97mgq" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.197145 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac953f18-4fbf-455f-b229-a51977890aa6-catalog-content\") pod \"redhat-operators-97mgq\" (UID: \"ac953f18-4fbf-455f-b229-a51977890aa6\") " pod="openshift-marketplace/redhat-operators-97mgq" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.197324 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxrgg\" (UniqueName: \"kubernetes.io/projected/ac953f18-4fbf-455f-b229-a51977890aa6-kube-api-access-hxrgg\") pod \"redhat-operators-97mgq\" (UID: \"ac953f18-4fbf-455f-b229-a51977890aa6\") " pod="openshift-marketplace/redhat-operators-97mgq" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.198131 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac953f18-4fbf-455f-b229-a51977890aa6-utilities\") pod \"redhat-operators-97mgq\" (UID: \"ac953f18-4fbf-455f-b229-a51977890aa6\") " pod="openshift-marketplace/redhat-operators-97mgq" Feb 18 05:49:10 crc 
kubenswrapper[4869]: I0218 05:49:10.198796 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac953f18-4fbf-455f-b229-a51977890aa6-catalog-content\") pod \"redhat-operators-97mgq\" (UID: \"ac953f18-4fbf-455f-b229-a51977890aa6\") " pod="openshift-marketplace/redhat-operators-97mgq" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.221706 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxrgg\" (UniqueName: \"kubernetes.io/projected/ac953f18-4fbf-455f-b229-a51977890aa6-kube-api-access-hxrgg\") pod \"redhat-operators-97mgq\" (UID: \"ac953f18-4fbf-455f-b229-a51977890aa6\") " pod="openshift-marketplace/redhat-operators-97mgq" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.291131 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-tc859" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.291266 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-tc859" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.294026 4869 patch_prober.go:28] interesting pod/console-f9d7485db-tc859 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.294127 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-tc859" podUID="97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.302948 4869 patch_prober.go:28] interesting pod/downloads-7954f5f757-mx9zm container/download-server namespace/openshift-console: 
Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.303003 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mx9zm" podUID="d13ea6d8-477e-4add-b9dc-f8cac9eb0b01" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.302951 4869 patch_prober.go:28] interesting pod/downloads-7954f5f757-mx9zm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.303103 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mx9zm" podUID="d13ea6d8-477e-4add-b9dc-f8cac9eb0b01" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.333995 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.334069 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.340203 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.350102 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6rqpw"] Feb 18 05:49:10 crc 
kubenswrapper[4869]: I0218 05:49:10.408642 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bjkfs"] Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.410084 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bjkfs" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.418374 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bjkfs"] Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.455278 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-97mgq" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.501131 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e661f166-1e0a-481d-86d9-6d062411d9db-catalog-content\") pod \"redhat-operators-bjkfs\" (UID: \"e661f166-1e0a-481d-86d9-6d062411d9db\") " pod="openshift-marketplace/redhat-operators-bjkfs" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.501534 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e661f166-1e0a-481d-86d9-6d062411d9db-utilities\") pod \"redhat-operators-bjkfs\" (UID: \"e661f166-1e0a-481d-86d9-6d062411d9db\") " pod="openshift-marketplace/redhat-operators-bjkfs" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.501601 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5882\" (UniqueName: \"kubernetes.io/projected/e661f166-1e0a-481d-86d9-6d062411d9db-kube-api-access-x5882\") pod \"redhat-operators-bjkfs\" (UID: \"e661f166-1e0a-481d-86d9-6d062411d9db\") " pod="openshift-marketplace/redhat-operators-bjkfs" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.603116 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e661f166-1e0a-481d-86d9-6d062411d9db-catalog-content\") pod \"redhat-operators-bjkfs\" (UID: \"e661f166-1e0a-481d-86d9-6d062411d9db\") " pod="openshift-marketplace/redhat-operators-bjkfs" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.603260 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e661f166-1e0a-481d-86d9-6d062411d9db-utilities\") pod \"redhat-operators-bjkfs\" (UID: \"e661f166-1e0a-481d-86d9-6d062411d9db\") " pod="openshift-marketplace/redhat-operators-bjkfs" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.603284 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5882\" (UniqueName: \"kubernetes.io/projected/e661f166-1e0a-481d-86d9-6d062411d9db-kube-api-access-x5882\") pod \"redhat-operators-bjkfs\" (UID: \"e661f166-1e0a-481d-86d9-6d062411d9db\") " pod="openshift-marketplace/redhat-operators-bjkfs" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.604411 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e661f166-1e0a-481d-86d9-6d062411d9db-catalog-content\") pod \"redhat-operators-bjkfs\" (UID: \"e661f166-1e0a-481d-86d9-6d062411d9db\") " pod="openshift-marketplace/redhat-operators-bjkfs" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.604809 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e661f166-1e0a-481d-86d9-6d062411d9db-utilities\") pod \"redhat-operators-bjkfs\" (UID: \"e661f166-1e0a-481d-86d9-6d062411d9db\") " pod="openshift-marketplace/redhat-operators-bjkfs" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.634844 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-x5882\" (UniqueName: \"kubernetes.io/projected/e661f166-1e0a-481d-86d9-6d062411d9db-kube-api-access-x5882\") pod \"redhat-operators-bjkfs\" (UID: \"e661f166-1e0a-481d-86d9-6d062411d9db\") " pod="openshift-marketplace/redhat-operators-bjkfs" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.730911 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bjkfs" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.746008 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-97mgq"] Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.939514 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-pg7gk" Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.950370 4869 patch_prober.go:28] interesting pod/router-default-5444994796-pg7gk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 05:49:10 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld Feb 18 05:49:10 crc kubenswrapper[4869]: [+]process-running ok Feb 18 05:49:10 crc kubenswrapper[4869]: healthz check failed Feb 18 05:49:10 crc kubenswrapper[4869]: I0218 05:49:10.950464 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pg7gk" podUID="61805635-7d39-4980-b1be-18cf8f05074d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 05:49:11 crc kubenswrapper[4869]: I0218 05:49:11.063603 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bjkfs"] Feb 18 05:49:11 crc kubenswrapper[4869]: W0218 05:49:11.074440 4869 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode661f166_1e0a_481d_86d9_6d062411d9db.slice/crio-3c9e48409eb4dd793e34d319b2b7109584f3549e5c83d58c515ad0d1391e8e9b WatchSource:0}: Error finding container 3c9e48409eb4dd793e34d319b2b7109584f3549e5c83d58c515ad0d1391e8e9b: Status 404 returned error can't find the container with id 3c9e48409eb4dd793e34d319b2b7109584f3549e5c83d58c515ad0d1391e8e9b Feb 18 05:49:11 crc kubenswrapper[4869]: I0218 05:49:11.120787 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" event={"ID":"7bd4427e-4327-477c-a527-de6c4bf89088","Type":"ContainerStarted","Data":"165e47f68f85e2a02c73a67eb7fbe81371e01a7fcb97182bab7d49f7622e6b40"} Feb 18 05:49:11 crc kubenswrapper[4869]: I0218 05:49:11.120889 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" event={"ID":"7bd4427e-4327-477c-a527-de6c4bf89088","Type":"ContainerStarted","Data":"01b66c0fb6f56292defad182a097338940db5c6684a023ba34afac5db0df3b05"} Feb 18 05:49:11 crc kubenswrapper[4869]: I0218 05:49:11.121137 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:49:11 crc kubenswrapper[4869]: I0218 05:49:11.131818 4869 generic.go:334] "Generic (PLEG): container finished" podID="ac953f18-4fbf-455f-b229-a51977890aa6" containerID="c22247b6183532b3afbb76c0bc237896949d16653ffa4bb10af14ddce883118e" exitCode=0 Feb 18 05:49:11 crc kubenswrapper[4869]: I0218 05:49:11.131926 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97mgq" event={"ID":"ac953f18-4fbf-455f-b229-a51977890aa6","Type":"ContainerDied","Data":"c22247b6183532b3afbb76c0bc237896949d16653ffa4bb10af14ddce883118e"} Feb 18 05:49:11 crc kubenswrapper[4869]: I0218 05:49:11.132000 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-97mgq" event={"ID":"ac953f18-4fbf-455f-b229-a51977890aa6","Type":"ContainerStarted","Data":"a9447bd55b8ea888f696ae29cb06c13df15bd719b255085f7c6854d5f8fc995a"} Feb 18 05:49:11 crc kubenswrapper[4869]: I0218 05:49:11.137797 4869 generic.go:334] "Generic (PLEG): container finished" podID="c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf" containerID="e7ba3a4c09402694d3eb14f70c7a6a424db9ddc92e4597ffdcc385fa50203789" exitCode=0 Feb 18 05:49:11 crc kubenswrapper[4869]: I0218 05:49:11.137904 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-j2h97" event={"ID":"c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf","Type":"ContainerDied","Data":"e7ba3a4c09402694d3eb14f70c7a6a424db9ddc92e4597ffdcc385fa50203789"} Feb 18 05:49:11 crc kubenswrapper[4869]: I0218 05:49:11.147878 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjkfs" event={"ID":"e661f166-1e0a-481d-86d9-6d062411d9db","Type":"ContainerStarted","Data":"3c9e48409eb4dd793e34d319b2b7109584f3549e5c83d58c515ad0d1391e8e9b"} Feb 18 05:49:11 crc kubenswrapper[4869]: I0218 05:49:11.148428 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" podStartSLOduration=25.148406278 podStartE2EDuration="25.148406278s" podCreationTimestamp="2026-02-18 05:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:11.143225203 +0000 UTC m=+48.312313455" watchObservedRunningTime="2026-02-18 05:49:11.148406278 +0000 UTC m=+48.317494510" Feb 18 05:49:11 crc kubenswrapper[4869]: I0218 05:49:11.153381 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-bft8b" Feb 18 05:49:11 crc kubenswrapper[4869]: I0218 05:49:11.156761 4869 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-stbsw" Feb 18 05:49:11 crc kubenswrapper[4869]: I0218 05:49:11.492802 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 18 05:49:11 crc kubenswrapper[4869]: I0218 05:49:11.945865 4869 patch_prober.go:28] interesting pod/router-default-5444994796-pg7gk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 05:49:11 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld Feb 18 05:49:11 crc kubenswrapper[4869]: [+]process-running ok Feb 18 05:49:11 crc kubenswrapper[4869]: healthz check failed Feb 18 05:49:11 crc kubenswrapper[4869]: I0218 05:49:11.945945 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pg7gk" podUID="61805635-7d39-4980-b1be-18cf8f05074d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 05:49:12 crc kubenswrapper[4869]: E0218 05:49:12.027653 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6592c56fc0692e811ff75555bd961fe7caa7d8b8eae9f718fec9e50a184f7b08" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 18 05:49:12 crc kubenswrapper[4869]: E0218 05:49:12.081085 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6592c56fc0692e811ff75555bd961fe7caa7d8b8eae9f718fec9e50a184f7b08" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 18 05:49:12 crc kubenswrapper[4869]: E0218 
05:49:12.083635 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6592c56fc0692e811ff75555bd961fe7caa7d8b8eae9f718fec9e50a184f7b08" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 18 05:49:12 crc kubenswrapper[4869]: E0218 05:49:12.083758 4869 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-zc447" podUID="873e5900-9845-4229-9d30-8b59c34f86fc" containerName="kube-multus-additional-cni-plugins" Feb 18 05:49:12 crc kubenswrapper[4869]: I0218 05:49:12.174577 4869 generic.go:334] "Generic (PLEG): container finished" podID="e661f166-1e0a-481d-86d9-6d062411d9db" containerID="4f712046b1aed10017948a1b762bf0d966459745aa33143d658a0afc67fa26e8" exitCode=0 Feb 18 05:49:12 crc kubenswrapper[4869]: I0218 05:49:12.174810 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjkfs" event={"ID":"e661f166-1e0a-481d-86d9-6d062411d9db","Type":"ContainerDied","Data":"4f712046b1aed10017948a1b762bf0d966459745aa33143d658a0afc67fa26e8"} Feb 18 05:49:12 crc kubenswrapper[4869]: I0218 05:49:12.204858 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 18 05:49:12 crc kubenswrapper[4869]: I0218 05:49:12.614395 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 18 05:49:12 crc kubenswrapper[4869]: I0218 05:49:12.622483 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 18 05:49:12 crc kubenswrapper[4869]: I0218 05:49:12.642458 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 18 05:49:12 crc kubenswrapper[4869]: I0218 05:49:12.644134 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 18 05:49:12 crc kubenswrapper[4869]: I0218 05:49:12.653599 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 18 05:49:12 crc kubenswrapper[4869]: I0218 05:49:12.680902 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 18 05:49:12 crc kubenswrapper[4869]: I0218 05:49:12.726725 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ef4a3562-a825-459a-8b41-cf4f307561d1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ef4a3562-a825-459a-8b41-cf4f307561d1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 18 05:49:12 crc kubenswrapper[4869]: I0218 05:49:12.726785 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ef4a3562-a825-459a-8b41-cf4f307561d1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ef4a3562-a825-459a-8b41-cf4f307561d1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 18 05:49:12 crc kubenswrapper[4869]: I0218 05:49:12.733130 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=0.733106713 podStartE2EDuration="733.106713ms" podCreationTimestamp="2026-02-18 05:49:12 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:12.732778275 +0000 UTC m=+49.901866507" watchObservedRunningTime="2026-02-18 05:49:12.733106713 +0000 UTC m=+49.902194945" Feb 18 05:49:12 crc kubenswrapper[4869]: I0218 05:49:12.794216 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-j2h97" Feb 18 05:49:12 crc kubenswrapper[4869]: I0218 05:49:12.827647 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ef4a3562-a825-459a-8b41-cf4f307561d1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ef4a3562-a825-459a-8b41-cf4f307561d1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 18 05:49:12 crc kubenswrapper[4869]: I0218 05:49:12.827966 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ef4a3562-a825-459a-8b41-cf4f307561d1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ef4a3562-a825-459a-8b41-cf4f307561d1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 18 05:49:12 crc kubenswrapper[4869]: I0218 05:49:12.828082 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ef4a3562-a825-459a-8b41-cf4f307561d1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ef4a3562-a825-459a-8b41-cf4f307561d1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 18 05:49:12 crc kubenswrapper[4869]: I0218 05:49:12.869561 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ef4a3562-a825-459a-8b41-cf4f307561d1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ef4a3562-a825-459a-8b41-cf4f307561d1\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 18 05:49:12 crc kubenswrapper[4869]: I0218 05:49:12.929859 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf-secret-volume\") pod \"c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf\" (UID: \"c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf\") "
Feb 18 05:49:12 crc kubenswrapper[4869]: I0218 05:49:12.929937 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf-config-volume\") pod \"c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf\" (UID: \"c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf\") "
Feb 18 05:49:12 crc kubenswrapper[4869]: I0218 05:49:12.930006 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnpfw\" (UniqueName: \"kubernetes.io/projected/c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf-kube-api-access-mnpfw\") pod \"c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf\" (UID: \"c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf\") "
Feb 18 05:49:12 crc kubenswrapper[4869]: I0218 05:49:12.933615 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf-config-volume" (OuterVolumeSpecName: "config-volume") pod "c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf" (UID: "c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 05:49:12 crc kubenswrapper[4869]: I0218 05:49:12.937410 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf-kube-api-access-mnpfw" (OuterVolumeSpecName: "kube-api-access-mnpfw") pod "c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf" (UID: "c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf"). InnerVolumeSpecName "kube-api-access-mnpfw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 05:49:12 crc kubenswrapper[4869]: I0218 05:49:12.944134 4869 patch_prober.go:28] interesting pod/router-default-5444994796-pg7gk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 05:49:12 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld
Feb 18 05:49:12 crc kubenswrapper[4869]: [+]process-running ok
Feb 18 05:49:12 crc kubenswrapper[4869]: healthz check failed
Feb 18 05:49:12 crc kubenswrapper[4869]: I0218 05:49:12.944218 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pg7gk" podUID="61805635-7d39-4980-b1be-18cf8f05074d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 05:49:12 crc kubenswrapper[4869]: I0218 05:49:12.944540 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf" (UID: "c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 05:49:12 crc kubenswrapper[4869]: I0218 05:49:12.970795 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 18 05:49:13 crc kubenswrapper[4869]: I0218 05:49:13.032850 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnpfw\" (UniqueName: \"kubernetes.io/projected/c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf-kube-api-access-mnpfw\") on node \"crc\" DevicePath \"\""
Feb 18 05:49:13 crc kubenswrapper[4869]: I0218 05:49:13.032911 4869 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 18 05:49:13 crc kubenswrapper[4869]: I0218 05:49:13.032921 4869 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf-config-volume\") on node \"crc\" DevicePath \"\""
Feb 18 05:49:13 crc kubenswrapper[4869]: I0218 05:49:13.208338 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-j2h97" event={"ID":"c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf","Type":"ContainerDied","Data":"d17df9aff2424f0a96ad9b0ebf96463bf8c7261677ba57f821a56caa3f292cf6"}
Feb 18 05:49:13 crc kubenswrapper[4869]: I0218 05:49:13.209278 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-j2h97"
Feb 18 05:49:13 crc kubenswrapper[4869]: I0218 05:49:13.209283 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d17df9aff2424f0a96ad9b0ebf96463bf8c7261677ba57f821a56caa3f292cf6"
Feb 18 05:49:13 crc kubenswrapper[4869]: I0218 05:49:13.215912 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 18 05:49:13 crc kubenswrapper[4869]: W0218 05:49:13.218736 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podef4a3562_a825_459a_8b41_cf4f307561d1.slice/crio-be78dff86aab418a5272cc2a10be16bff0ac4884fec8ec6d166179dda50d0078 WatchSource:0}: Error finding container be78dff86aab418a5272cc2a10be16bff0ac4884fec8ec6d166179dda50d0078: Status 404 returned error can't find the container with id be78dff86aab418a5272cc2a10be16bff0ac4884fec8ec6d166179dda50d0078
Feb 18 05:49:13 crc kubenswrapper[4869]: I0218 05:49:13.402009 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q"
Feb 18 05:49:13 crc kubenswrapper[4869]: I0218 05:49:13.402238 4869 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 18 05:49:13 crc kubenswrapper[4869]: I0218 05:49:13.513446 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q"
Feb 18 05:49:13 crc kubenswrapper[4869]: I0218 05:49:13.946309 4869 patch_prober.go:28] interesting pod/router-default-5444994796-pg7gk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 05:49:13 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld
Feb 18 05:49:13 crc kubenswrapper[4869]: [+]process-running ok
Feb 18 05:49:13 crc kubenswrapper[4869]: healthz check failed
Feb 18 05:49:13 crc kubenswrapper[4869]: I0218 05:49:13.946908 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pg7gk" podUID="61805635-7d39-4980-b1be-18cf8f05074d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 05:49:14 crc kubenswrapper[4869]: I0218 05:49:14.219300 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ef4a3562-a825-459a-8b41-cf4f307561d1","Type":"ContainerStarted","Data":"fe68a837689e459a3347ae68b3bb0723217e88a3213f99ff37c128a2f0848bab"}
Feb 18 05:49:14 crc kubenswrapper[4869]: I0218 05:49:14.219345 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ef4a3562-a825-459a-8b41-cf4f307561d1","Type":"ContainerStarted","Data":"be78dff86aab418a5272cc2a10be16bff0ac4884fec8ec6d166179dda50d0078"}
Feb 18 05:49:14 crc kubenswrapper[4869]: I0218 05:49:14.234606 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.234587172 podStartE2EDuration="2.234587172s" podCreationTimestamp="2026-02-18 05:49:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:14.234252934 +0000 UTC m=+51.403341166" watchObservedRunningTime="2026-02-18 05:49:14.234587172 +0000 UTC m=+51.403675404"
Feb 18 05:49:14 crc kubenswrapper[4869]: I0218 05:49:14.507750 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 18 05:49:14 crc kubenswrapper[4869]: E0218 05:49:14.508075 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf" containerName="collect-profiles"
Feb 18 05:49:14 crc kubenswrapper[4869]: I0218 05:49:14.508088 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf" containerName="collect-profiles"
Feb 18 05:49:14 crc kubenswrapper[4869]: I0218 05:49:14.508207 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf" containerName="collect-profiles"
Feb 18 05:49:14 crc kubenswrapper[4869]: I0218 05:49:14.508721 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 05:49:14 crc kubenswrapper[4869]: I0218 05:49:14.511465 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 18 05:49:14 crc kubenswrapper[4869]: I0218 05:49:14.511822 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 18 05:49:14 crc kubenswrapper[4869]: I0218 05:49:14.520094 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 18 05:49:14 crc kubenswrapper[4869]: I0218 05:49:14.571113 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bfb1e0ef-f39a-442b-997b-1d21dba07476-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bfb1e0ef-f39a-442b-997b-1d21dba07476\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 05:49:14 crc kubenswrapper[4869]: I0218 05:49:14.571203 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bfb1e0ef-f39a-442b-997b-1d21dba07476-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bfb1e0ef-f39a-442b-997b-1d21dba07476\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 05:49:14 crc kubenswrapper[4869]: I0218 05:49:14.672778 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bfb1e0ef-f39a-442b-997b-1d21dba07476-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bfb1e0ef-f39a-442b-997b-1d21dba07476\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 05:49:14 crc kubenswrapper[4869]: I0218 05:49:14.672843 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bfb1e0ef-f39a-442b-997b-1d21dba07476-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bfb1e0ef-f39a-442b-997b-1d21dba07476\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 05:49:14 crc kubenswrapper[4869]: I0218 05:49:14.673070 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bfb1e0ef-f39a-442b-997b-1d21dba07476-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bfb1e0ef-f39a-442b-997b-1d21dba07476\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 05:49:14 crc kubenswrapper[4869]: I0218 05:49:14.695754 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bfb1e0ef-f39a-442b-997b-1d21dba07476-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bfb1e0ef-f39a-442b-997b-1d21dba07476\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 05:49:14 crc kubenswrapper[4869]: I0218 05:49:14.833368 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 05:49:14 crc kubenswrapper[4869]: I0218 05:49:14.945656 4869 patch_prober.go:28] interesting pod/router-default-5444994796-pg7gk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 05:49:14 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld
Feb 18 05:49:14 crc kubenswrapper[4869]: [+]process-running ok
Feb 18 05:49:14 crc kubenswrapper[4869]: healthz check failed
Feb 18 05:49:14 crc kubenswrapper[4869]: I0218 05:49:14.946008 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pg7gk" podUID="61805635-7d39-4980-b1be-18cf8f05074d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 05:49:15 crc kubenswrapper[4869]: I0218 05:49:15.253966 4869 generic.go:334] "Generic (PLEG): container finished" podID="ef4a3562-a825-459a-8b41-cf4f307561d1" containerID="fe68a837689e459a3347ae68b3bb0723217e88a3213f99ff37c128a2f0848bab" exitCode=0
Feb 18 05:49:15 crc kubenswrapper[4869]: I0218 05:49:15.254023 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ef4a3562-a825-459a-8b41-cf4f307561d1","Type":"ContainerDied","Data":"fe68a837689e459a3347ae68b3bb0723217e88a3213f99ff37c128a2f0848bab"}
Feb 18 05:49:15 crc kubenswrapper[4869]: I0218 05:49:15.294592 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 05:49:15 crc kubenswrapper[4869]: I0218 05:49:15.294692 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 05:49:15 crc kubenswrapper[4869]: I0218 05:49:15.294788 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 05:49:15 crc kubenswrapper[4869]: I0218 05:49:15.294817 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 05:49:15 crc kubenswrapper[4869]: I0218 05:49:15.295806 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 05:49:15 crc kubenswrapper[4869]: I0218 05:49:15.301639 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 05:49:15 crc kubenswrapper[4869]: I0218 05:49:15.314980 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 05:49:15 crc kubenswrapper[4869]: I0218 05:49:15.323638 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 05:49:15 crc kubenswrapper[4869]: I0218 05:49:15.395474 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 05:49:15 crc kubenswrapper[4869]: I0218 05:49:15.402471 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 05:49:15 crc kubenswrapper[4869]: I0218 05:49:15.414251 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 05:49:15 crc kubenswrapper[4869]: I0218 05:49:15.487332 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 18 05:49:15 crc kubenswrapper[4869]: W0218 05:49:15.527803 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbfb1e0ef_f39a_442b_997b_1d21dba07476.slice/crio-8156fda4b03d3fb634f5021bce7e4a72cf591dc11195d3de8a826f8b23840a3a WatchSource:0}: Error finding container 8156fda4b03d3fb634f5021bce7e4a72cf591dc11195d3de8a826f8b23840a3a: Status 404 returned error can't find the container with id 8156fda4b03d3fb634f5021bce7e4a72cf591dc11195d3de8a826f8b23840a3a
Feb 18 05:49:15 crc kubenswrapper[4869]: W0218 05:49:15.843092 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-6399f7784455427319de1abcc4268d59ae6bfadb7f6794ba3fa31ea820c3ca4c WatchSource:0}: Error finding container 6399f7784455427319de1abcc4268d59ae6bfadb7f6794ba3fa31ea820c3ca4c: Status 404 returned error can't find the container with id 6399f7784455427319de1abcc4268d59ae6bfadb7f6794ba3fa31ea820c3ca4c
Feb 18 05:49:15 crc kubenswrapper[4869]: I0218 05:49:15.944391 4869 patch_prober.go:28] interesting pod/router-default-5444994796-pg7gk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 05:49:15 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld
Feb 18 05:49:15 crc kubenswrapper[4869]: [+]process-running ok
Feb 18 05:49:15 crc kubenswrapper[4869]: healthz check failed
Feb 18 05:49:15 crc kubenswrapper[4869]: I0218 05:49:15.944491 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pg7gk" podUID="61805635-7d39-4980-b1be-18cf8f05074d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 05:49:16 crc kubenswrapper[4869]: I0218 05:49:16.341974 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"137208882b188fd2f8805f8c57c5112db5594c9e247da476a8b8dacb384adc81"}
Feb 18 05:49:16 crc kubenswrapper[4869]: I0218 05:49:16.346243 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"5784fd8b71323bc35f62e61638222538af0d69bdb078276ce76d26bc4a127d6c"}
Feb 18 05:49:16 crc kubenswrapper[4869]: I0218 05:49:16.350073 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6399f7784455427319de1abcc4268d59ae6bfadb7f6794ba3fa31ea820c3ca4c"}
Feb 18 05:49:16 crc kubenswrapper[4869]: I0218 05:49:16.364569 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bfb1e0ef-f39a-442b-997b-1d21dba07476","Type":"ContainerStarted","Data":"8156fda4b03d3fb634f5021bce7e4a72cf591dc11195d3de8a826f8b23840a3a"}
Feb 18 05:49:16 crc kubenswrapper[4869]: I0218 05:49:16.759585 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 18 05:49:16 crc kubenswrapper[4869]: I0218 05:49:16.834910 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ef4a3562-a825-459a-8b41-cf4f307561d1-kube-api-access\") pod \"ef4a3562-a825-459a-8b41-cf4f307561d1\" (UID: \"ef4a3562-a825-459a-8b41-cf4f307561d1\") "
Feb 18 05:49:16 crc kubenswrapper[4869]: I0218 05:49:16.835005 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ef4a3562-a825-459a-8b41-cf4f307561d1-kubelet-dir\") pod \"ef4a3562-a825-459a-8b41-cf4f307561d1\" (UID: \"ef4a3562-a825-459a-8b41-cf4f307561d1\") "
Feb 18 05:49:16 crc kubenswrapper[4869]: I0218 05:49:16.835321 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef4a3562-a825-459a-8b41-cf4f307561d1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ef4a3562-a825-459a-8b41-cf4f307561d1" (UID: "ef4a3562-a825-459a-8b41-cf4f307561d1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 05:49:16 crc kubenswrapper[4869]: I0218 05:49:16.843155 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef4a3562-a825-459a-8b41-cf4f307561d1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ef4a3562-a825-459a-8b41-cf4f307561d1" (UID: "ef4a3562-a825-459a-8b41-cf4f307561d1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 05:49:16 crc kubenswrapper[4869]: I0218 05:49:16.936876 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ef4a3562-a825-459a-8b41-cf4f307561d1-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 18 05:49:16 crc kubenswrapper[4869]: I0218 05:49:16.936913 4869 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ef4a3562-a825-459a-8b41-cf4f307561d1-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 18 05:49:16 crc kubenswrapper[4869]: I0218 05:49:16.945076 4869 patch_prober.go:28] interesting pod/router-default-5444994796-pg7gk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 05:49:16 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld
Feb 18 05:49:16 crc kubenswrapper[4869]: [+]process-running ok
Feb 18 05:49:16 crc kubenswrapper[4869]: healthz check failed
Feb 18 05:49:16 crc kubenswrapper[4869]: I0218 05:49:16.945503 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pg7gk" podUID="61805635-7d39-4980-b1be-18cf8f05074d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 05:49:17 crc kubenswrapper[4869]: I0218 05:49:17.014469 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-h6w89"
Feb 18 05:49:17 crc kubenswrapper[4869]: I0218 05:49:17.374909 4869 generic.go:334] "Generic (PLEG): container finished" podID="bfb1e0ef-f39a-442b-997b-1d21dba07476" containerID="21f6aa3f26fc7b59ddf098ca6f42fc0d67606b10081a5c5a429e32242f69d618" exitCode=0
Feb 18 05:49:17 crc kubenswrapper[4869]: I0218 05:49:17.375131 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bfb1e0ef-f39a-442b-997b-1d21dba07476","Type":"ContainerDied","Data":"21f6aa3f26fc7b59ddf098ca6f42fc0d67606b10081a5c5a429e32242f69d618"}
Feb 18 05:49:17 crc kubenswrapper[4869]: I0218 05:49:17.378124 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"dd25527ae6abbd0f1e113e0cd60e101361a858a0741f9d21c718e267c41ca61c"}
Feb 18 05:49:17 crc kubenswrapper[4869]: I0218 05:49:17.378431 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 05:49:17 crc kubenswrapper[4869]: I0218 05:49:17.381723 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4003031d70d0af80c86fccfc6580cdf26b75e193a1df86ef394773705c179f1c"}
Feb 18 05:49:17 crc kubenswrapper[4869]: I0218 05:49:17.402724 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"068dce418c5955d8e628f7ba3acd157571c554d420b57b345e8e882d5c529858"}
Feb 18 05:49:17 crc kubenswrapper[4869]: I0218 05:49:17.406798 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ef4a3562-a825-459a-8b41-cf4f307561d1","Type":"ContainerDied","Data":"be78dff86aab418a5272cc2a10be16bff0ac4884fec8ec6d166179dda50d0078"}
Feb 18 05:49:17 crc kubenswrapper[4869]: I0218 05:49:17.406827 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be78dff86aab418a5272cc2a10be16bff0ac4884fec8ec6d166179dda50d0078"
Feb 18 05:49:17 crc kubenswrapper[4869]: I0218 05:49:17.406864 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 18 05:49:18 crc kubenswrapper[4869]: I0218 05:49:18.438997 4869 patch_prober.go:28] interesting pod/router-default-5444994796-pg7gk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 05:49:18 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld
Feb 18 05:49:18 crc kubenswrapper[4869]: [+]process-running ok
Feb 18 05:49:18 crc kubenswrapper[4869]: healthz check failed
Feb 18 05:49:18 crc kubenswrapper[4869]: I0218 05:49:18.439064 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pg7gk" podUID="61805635-7d39-4980-b1be-18cf8f05074d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 05:49:18 crc kubenswrapper[4869]: E0218 05:49:18.494300 4869 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.007s"
Feb 18 05:49:18 crc kubenswrapper[4869]: I0218 05:49:18.943472 4869 patch_prober.go:28] interesting pod/router-default-5444994796-pg7gk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 05:49:18 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld
Feb 18 05:49:18 crc kubenswrapper[4869]: [+]process-running ok
Feb 18 05:49:18 crc kubenswrapper[4869]: healthz check failed
Feb 18 05:49:18 crc kubenswrapper[4869]: I0218 05:49:18.943968 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pg7gk" podUID="61805635-7d39-4980-b1be-18cf8f05074d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 05:49:19 crc kubenswrapper[4869]: I0218 05:49:19.945932 4869 patch_prober.go:28] interesting pod/router-default-5444994796-pg7gk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 05:49:19 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld
Feb 18 05:49:19 crc kubenswrapper[4869]: [+]process-running ok
Feb 18 05:49:19 crc kubenswrapper[4869]: healthz check failed
Feb 18 05:49:19 crc kubenswrapper[4869]: I0218 05:49:19.946013 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pg7gk" podUID="61805635-7d39-4980-b1be-18cf8f05074d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 05:49:20 crc kubenswrapper[4869]: I0218 05:49:20.293502 4869 patch_prober.go:28] interesting pod/console-f9d7485db-tc859 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Feb 18 05:49:20 crc kubenswrapper[4869]: I0218 05:49:20.293855 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-tc859" podUID="97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused"
Feb 18 05:49:20 crc kubenswrapper[4869]: I0218 05:49:20.309270 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-mx9zm"
Feb 18 05:49:20 crc kubenswrapper[4869]: I0218 05:49:20.942853 4869 patch_prober.go:28] interesting pod/router-default-5444994796-pg7gk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 05:49:20 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld
Feb 18 05:49:20 crc kubenswrapper[4869]: [+]process-running ok
Feb 18 05:49:20 crc kubenswrapper[4869]: healthz check failed
Feb 18 05:49:20 crc kubenswrapper[4869]: I0218 05:49:20.942937 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pg7gk" podUID="61805635-7d39-4980-b1be-18cf8f05074d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 05:49:21 crc kubenswrapper[4869]: I0218 05:49:21.941524 4869 patch_prober.go:28] interesting pod/router-default-5444994796-pg7gk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 05:49:21 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld
Feb 18 05:49:21 crc kubenswrapper[4869]: [+]process-running ok
Feb 18 05:49:21 crc kubenswrapper[4869]: healthz check failed
Feb 18 05:49:21 crc kubenswrapper[4869]: I0218 05:49:21.941582 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pg7gk" podUID="61805635-7d39-4980-b1be-18cf8f05074d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 05:49:22 crc kubenswrapper[4869]: E0218 05:49:22.026487 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6592c56fc0692e811ff75555bd961fe7caa7d8b8eae9f718fec9e50a184f7b08" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 18 05:49:22 crc kubenswrapper[4869]: E0218 05:49:22.030138 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6592c56fc0692e811ff75555bd961fe7caa7d8b8eae9f718fec9e50a184f7b08" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 18 05:49:22 crc kubenswrapper[4869]: E0218 05:49:22.036754 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6592c56fc0692e811ff75555bd961fe7caa7d8b8eae9f718fec9e50a184f7b08" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 18 05:49:22 crc kubenswrapper[4869]: E0218 05:49:22.036795 4869 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-zc447" podUID="873e5900-9845-4229-9d30-8b59c34f86fc" containerName="kube-multus-additional-cni-plugins"
Feb 18 05:49:22 crc kubenswrapper[4869]: I0218 05:49:22.941872 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-pg7gk"
Feb 18 05:49:22 crc kubenswrapper[4869]: I0218 05:49:22.943980 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-pg7gk"
Feb 18 05:49:24 crc kubenswrapper[4869]: I0218 05:49:24.391009 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 05:49:24 crc kubenswrapper[4869]: I0218 05:49:24.535412 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bfb1e0ef-f39a-442b-997b-1d21dba07476-kube-api-access\") pod \"bfb1e0ef-f39a-442b-997b-1d21dba07476\" (UID: \"bfb1e0ef-f39a-442b-997b-1d21dba07476\") "
Feb 18 05:49:24 crc kubenswrapper[4869]: I0218 05:49:24.535575 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bfb1e0ef-f39a-442b-997b-1d21dba07476-kubelet-dir\") pod \"bfb1e0ef-f39a-442b-997b-1d21dba07476\" (UID: \"bfb1e0ef-f39a-442b-997b-1d21dba07476\") "
Feb 18 05:49:24 crc kubenswrapper[4869]: I0218 05:49:24.535690 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfb1e0ef-f39a-442b-997b-1d21dba07476-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bfb1e0ef-f39a-442b-997b-1d21dba07476" (UID: "bfb1e0ef-f39a-442b-997b-1d21dba07476"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 05:49:24 crc kubenswrapper[4869]: I0218 05:49:24.535839 4869 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bfb1e0ef-f39a-442b-997b-1d21dba07476-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 18 05:49:24 crc kubenswrapper[4869]: I0218 05:49:24.557751 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfb1e0ef-f39a-442b-997b-1d21dba07476-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bfb1e0ef-f39a-442b-997b-1d21dba07476" (UID: "bfb1e0ef-f39a-442b-997b-1d21dba07476"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 05:49:24 crc kubenswrapper[4869]: I0218 05:49:24.558975 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bfb1e0ef-f39a-442b-997b-1d21dba07476","Type":"ContainerDied","Data":"8156fda4b03d3fb634f5021bce7e4a72cf591dc11195d3de8a826f8b23840a3a"}
Feb 18 05:49:24 crc kubenswrapper[4869]: I0218 05:49:24.559019 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8156fda4b03d3fb634f5021bce7e4a72cf591dc11195d3de8a826f8b23840a3a"
Feb 18 05:49:24 crc kubenswrapper[4869]: I0218 05:49:24.559116 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 05:49:24 crc kubenswrapper[4869]: I0218 05:49:24.637277 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bfb1e0ef-f39a-442b-997b-1d21dba07476-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 18 05:49:30 crc kubenswrapper[4869]: I0218 05:49:30.047575 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw"
Feb 18 05:49:30 crc kubenswrapper[4869]: I0218 05:49:30.296807 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-tc859"
Feb 18 05:49:30 crc kubenswrapper[4869]: I0218 05:49:30.300611 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-tc859"
Feb 18 05:49:32 crc kubenswrapper[4869]: E0218 05:49:32.023655 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6592c56fc0692e811ff75555bd961fe7caa7d8b8eae9f718fec9e50a184f7b08" cmd=["/bin/bash","-c","test
-f /ready/ready"] Feb 18 05:49:32 crc kubenswrapper[4869]: E0218 05:49:32.025201 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6592c56fc0692e811ff75555bd961fe7caa7d8b8eae9f718fec9e50a184f7b08" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 18 05:49:32 crc kubenswrapper[4869]: E0218 05:49:32.026311 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6592c56fc0692e811ff75555bd961fe7caa7d8b8eae9f718fec9e50a184f7b08" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 18 05:49:32 crc kubenswrapper[4869]: E0218 05:49:32.026357 4869 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-zc447" podUID="873e5900-9845-4229-9d30-8b59c34f86fc" containerName="kube-multus-additional-cni-plugins" Feb 18 05:49:33 crc kubenswrapper[4869]: I0218 05:49:33.485865 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 18 05:49:38 crc kubenswrapper[4869]: I0218 05:49:38.640321 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-zc447_873e5900-9845-4229-9d30-8b59c34f86fc/kube-multus-additional-cni-plugins/0.log" Feb 18 05:49:38 crc kubenswrapper[4869]: I0218 05:49:38.640898 4869 generic.go:334] "Generic (PLEG): container finished" podID="873e5900-9845-4229-9d30-8b59c34f86fc" containerID="6592c56fc0692e811ff75555bd961fe7caa7d8b8eae9f718fec9e50a184f7b08" exitCode=137 Feb 18 05:49:38 crc kubenswrapper[4869]: I0218 05:49:38.640941 4869 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-zc447" event={"ID":"873e5900-9845-4229-9d30-8b59c34f86fc","Type":"ContainerDied","Data":"6592c56fc0692e811ff75555bd961fe7caa7d8b8eae9f718fec9e50a184f7b08"} Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.008184 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-zc447_873e5900-9845-4229-9d30-8b59c34f86fc/kube-multus-additional-cni-plugins/0.log" Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.008521 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-zc447" Feb 18 05:49:40 crc kubenswrapper[4869]: E0218 05:49:40.022218 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 18 05:49:40 crc kubenswrapper[4869]: E0218 05:49:40.022494 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q8m7j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-brm92_openshift-marketplace(143f68ff-2d5f-435c-8a05-ebc0433fca48): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 18 05:49:40 crc kubenswrapper[4869]: E0218 05:49:40.025479 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-brm92" podUID="143f68ff-2d5f-435c-8a05-ebc0433fca48" Feb 18 05:49:40 crc 
kubenswrapper[4869]: I0218 05:49:40.037897 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=7.03785749 podStartE2EDuration="7.03785749s" podCreationTimestamp="2026-02-18 05:49:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:40.035645405 +0000 UTC m=+77.204733637" watchObservedRunningTime="2026-02-18 05:49:40.03785749 +0000 UTC m=+77.206945732" Feb 18 05:49:40 crc kubenswrapper[4869]: E0218 05:49:40.043856 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 18 05:49:40 crc kubenswrapper[4869]: E0218 05:49:40.044030 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b8gzp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-5vpbh_openshift-marketplace(a42348a5-962e-42b1-a0f5-67a89bd0532a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 18 05:49:40 crc kubenswrapper[4869]: E0218 05:49:40.045118 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-5vpbh" podUID="a42348a5-962e-42b1-a0f5-67a89bd0532a" Feb 18 05:49:40 crc 
kubenswrapper[4869]: E0218 05:49:40.050233 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 18 05:49:40 crc kubenswrapper[4869]: E0218 05:49:40.050470 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lbfq9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-2q2rk_openshift-marketplace(1fa8c341-03cc-49d4-8793-88cec4a8444d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 18 05:49:40 crc kubenswrapper[4869]: E0218 05:49:40.051752 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-2q2rk" podUID="1fa8c341-03cc-49d4-8793-88cec4a8444d" Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.123613 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/873e5900-9845-4229-9d30-8b59c34f86fc-cni-sysctl-allowlist\") pod \"873e5900-9845-4229-9d30-8b59c34f86fc\" (UID: \"873e5900-9845-4229-9d30-8b59c34f86fc\") " Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.123717 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/873e5900-9845-4229-9d30-8b59c34f86fc-ready\") pod \"873e5900-9845-4229-9d30-8b59c34f86fc\" (UID: \"873e5900-9845-4229-9d30-8b59c34f86fc\") " Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.123788 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ndr8\" (UniqueName: \"kubernetes.io/projected/873e5900-9845-4229-9d30-8b59c34f86fc-kube-api-access-6ndr8\") pod \"873e5900-9845-4229-9d30-8b59c34f86fc\" (UID: \"873e5900-9845-4229-9d30-8b59c34f86fc\") " Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.123839 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/873e5900-9845-4229-9d30-8b59c34f86fc-tuning-conf-dir\") pod 
\"873e5900-9845-4229-9d30-8b59c34f86fc\" (UID: \"873e5900-9845-4229-9d30-8b59c34f86fc\") " Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.124123 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/873e5900-9845-4229-9d30-8b59c34f86fc-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "873e5900-9845-4229-9d30-8b59c34f86fc" (UID: "873e5900-9845-4229-9d30-8b59c34f86fc"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.124574 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/873e5900-9845-4229-9d30-8b59c34f86fc-ready" (OuterVolumeSpecName: "ready") pod "873e5900-9845-4229-9d30-8b59c34f86fc" (UID: "873e5900-9845-4229-9d30-8b59c34f86fc"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.124862 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/873e5900-9845-4229-9d30-8b59c34f86fc-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "873e5900-9845-4229-9d30-8b59c34f86fc" (UID: "873e5900-9845-4229-9d30-8b59c34f86fc"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.131273 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/873e5900-9845-4229-9d30-8b59c34f86fc-kube-api-access-6ndr8" (OuterVolumeSpecName: "kube-api-access-6ndr8") pod "873e5900-9845-4229-9d30-8b59c34f86fc" (UID: "873e5900-9845-4229-9d30-8b59c34f86fc"). InnerVolumeSpecName "kube-api-access-6ndr8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.225138 4869 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/873e5900-9845-4229-9d30-8b59c34f86fc-ready\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.225175 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ndr8\" (UniqueName: \"kubernetes.io/projected/873e5900-9845-4229-9d30-8b59c34f86fc-kube-api-access-6ndr8\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.225190 4869 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/873e5900-9845-4229-9d30-8b59c34f86fc-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.225204 4869 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/873e5900-9845-4229-9d30-8b59c34f86fc-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.655544 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97mgq" event={"ID":"ac953f18-4fbf-455f-b229-a51977890aa6","Type":"ContainerStarted","Data":"b04a039a266aa65388efee3e719f3d075106eec8f2ed018a73a292ce9d4b6d23"} Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.658237 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-zc447_873e5900-9845-4229-9d30-8b59c34f86fc/kube-multus-additional-cni-plugins/0.log" Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.658415 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-zc447" 
event={"ID":"873e5900-9845-4229-9d30-8b59c34f86fc","Type":"ContainerDied","Data":"fd865cd40c744ddfd22ad0b84037c0db21f48e91acd4a030906fda3bad57fbcb"} Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.658496 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-zc447" Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.658596 4869 scope.go:117] "RemoveContainer" containerID="6592c56fc0692e811ff75555bd961fe7caa7d8b8eae9f718fec9e50a184f7b08" Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.669935 4869 generic.go:334] "Generic (PLEG): container finished" podID="f16542c9-445d-4f1a-883d-0a5306a6e0da" containerID="dad91b736972020acf87743d6ce207a6766cc797e2f7ef569401108ddbc8d742" exitCode=0 Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.670021 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lfkdw" event={"ID":"f16542c9-445d-4f1a-883d-0a5306a6e0da","Type":"ContainerDied","Data":"dad91b736972020acf87743d6ce207a6766cc797e2f7ef569401108ddbc8d742"} Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.676171 4869 generic.go:334] "Generic (PLEG): container finished" podID="742847e6-6cb2-458e-8a75-2a76a970c4a4" containerID="9ed821e90ff3031a5e28548b5141d3a65994c73c6dc08852c06fb15628f1cbcb" exitCode=0 Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.676235 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gv5qc" event={"ID":"742847e6-6cb2-458e-8a75-2a76a970c4a4","Type":"ContainerDied","Data":"9ed821e90ff3031a5e28548b5141d3a65994c73c6dc08852c06fb15628f1cbcb"} Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.684586 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjkfs" event={"ID":"e661f166-1e0a-481d-86d9-6d062411d9db","Type":"ContainerStarted","Data":"d5864f84fcebea715e70c0aab4c21c19ace693f106d0d3edd93278ea4e82a9c9"} 
Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.689963 4869 generic.go:334] "Generic (PLEG): container finished" podID="5de90b28-d647-4947-b85a-9f65e908ac02" containerID="074b6b116609e411323ab63cb848c0d6d37d8804bfa4a3de688b129638babf9e" exitCode=0 Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.690201 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8mjl" event={"ID":"5de90b28-d647-4947-b85a-9f65e908ac02","Type":"ContainerDied","Data":"074b6b116609e411323ab63cb848c0d6d37d8804bfa4a3de688b129638babf9e"} Feb 18 05:49:40 crc kubenswrapper[4869]: E0218 05:49:40.697548 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-brm92" podUID="143f68ff-2d5f-435c-8a05-ebc0433fca48" Feb 18 05:49:40 crc kubenswrapper[4869]: E0218 05:49:40.697726 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-5vpbh" podUID="a42348a5-962e-42b1-a0f5-67a89bd0532a" Feb 18 05:49:40 crc kubenswrapper[4869]: E0218 05:49:40.701414 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-2q2rk" podUID="1fa8c341-03cc-49d4-8793-88cec4a8444d" Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.715027 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-zc447"] Feb 18 05:49:40 crc kubenswrapper[4869]: I0218 05:49:40.722051 4869 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-zc447"] Feb 18 05:49:41 crc kubenswrapper[4869]: I0218 05:49:41.478565 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="873e5900-9845-4229-9d30-8b59c34f86fc" path="/var/lib/kubelet/pods/873e5900-9845-4229-9d30-8b59c34f86fc/volumes" Feb 18 05:49:41 crc kubenswrapper[4869]: I0218 05:49:41.702107 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjkfs" event={"ID":"e661f166-1e0a-481d-86d9-6d062411d9db","Type":"ContainerDied","Data":"d5864f84fcebea715e70c0aab4c21c19ace693f106d0d3edd93278ea4e82a9c9"} Feb 18 05:49:41 crc kubenswrapper[4869]: I0218 05:49:41.702055 4869 generic.go:334] "Generic (PLEG): container finished" podID="e661f166-1e0a-481d-86d9-6d062411d9db" containerID="d5864f84fcebea715e70c0aab4c21c19ace693f106d0d3edd93278ea4e82a9c9" exitCode=0 Feb 18 05:49:41 crc kubenswrapper[4869]: I0218 05:49:41.705627 4869 generic.go:334] "Generic (PLEG): container finished" podID="ac953f18-4fbf-455f-b229-a51977890aa6" containerID="b04a039a266aa65388efee3e719f3d075106eec8f2ed018a73a292ce9d4b6d23" exitCode=0 Feb 18 05:49:41 crc kubenswrapper[4869]: I0218 05:49:41.705705 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97mgq" event={"ID":"ac953f18-4fbf-455f-b229-a51977890aa6","Type":"ContainerDied","Data":"b04a039a266aa65388efee3e719f3d075106eec8f2ed018a73a292ce9d4b6d23"} Feb 18 05:49:41 crc kubenswrapper[4869]: I0218 05:49:41.721136 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lfkdw" event={"ID":"f16542c9-445d-4f1a-883d-0a5306a6e0da","Type":"ContainerStarted","Data":"a53e6b34a781e705ae995355742cd7f11c11f3a95ce858c9c81d93f338478a22"} Feb 18 05:49:41 crc kubenswrapper[4869]: I0218 05:49:41.726879 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gv5qc" 
event={"ID":"742847e6-6cb2-458e-8a75-2a76a970c4a4","Type":"ContainerStarted","Data":"2399554798dd2d9194f369a5e7ae93b0538568ed170a25b7995287ab0b0874ec"} Feb 18 05:49:41 crc kubenswrapper[4869]: I0218 05:49:41.786361 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gv5qc" podStartSLOduration=2.449208653 podStartE2EDuration="33.786332024s" podCreationTimestamp="2026-02-18 05:49:08 +0000 UTC" firstStartedPulling="2026-02-18 05:49:10.105788299 +0000 UTC m=+47.274876531" lastFinishedPulling="2026-02-18 05:49:41.44291167 +0000 UTC m=+78.611999902" observedRunningTime="2026-02-18 05:49:41.784861607 +0000 UTC m=+78.953949839" watchObservedRunningTime="2026-02-18 05:49:41.786332024 +0000 UTC m=+78.955420256" Feb 18 05:49:41 crc kubenswrapper[4869]: I0218 05:49:41.789025 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lfkdw" podStartSLOduration=3.456928043 podStartE2EDuration="35.789016682s" podCreationTimestamp="2026-02-18 05:49:06 +0000 UTC" firstStartedPulling="2026-02-18 05:49:09.074903104 +0000 UTC m=+46.243991336" lastFinishedPulling="2026-02-18 05:49:41.406991743 +0000 UTC m=+78.576079975" observedRunningTime="2026-02-18 05:49:41.766832622 +0000 UTC m=+78.935920864" watchObservedRunningTime="2026-02-18 05:49:41.789016682 +0000 UTC m=+78.958104904" Feb 18 05:49:41 crc kubenswrapper[4869]: I0218 05:49:41.964871 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hj6gq" Feb 18 05:49:42 crc kubenswrapper[4869]: I0218 05:49:42.734831 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjkfs" event={"ID":"e661f166-1e0a-481d-86d9-6d062411d9db","Type":"ContainerStarted","Data":"729154753d61043c911bed4e8683244342a4acfd0ef525f0a51bb2463518636f"} Feb 18 05:49:42 crc kubenswrapper[4869]: I0218 
05:49:42.738843 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8mjl" event={"ID":"5de90b28-d647-4947-b85a-9f65e908ac02","Type":"ContainerStarted","Data":"b9706720224d5570fb00ae51e7832d41cce3dd8cb0883a1ef0f86d0806395dd1"} Feb 18 05:49:42 crc kubenswrapper[4869]: I0218 05:49:42.746083 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97mgq" event={"ID":"ac953f18-4fbf-455f-b229-a51977890aa6","Type":"ContainerStarted","Data":"41884e1e8621c03377bc5b38aa152e12413bd778c0fe44f1f85aad5a28f65a51"} Feb 18 05:49:42 crc kubenswrapper[4869]: I0218 05:49:42.762389 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bjkfs" podStartSLOduration=2.881150407 podStartE2EDuration="32.76237195s" podCreationTimestamp="2026-02-18 05:49:10 +0000 UTC" firstStartedPulling="2026-02-18 05:49:12.181480075 +0000 UTC m=+49.350568307" lastFinishedPulling="2026-02-18 05:49:42.062701608 +0000 UTC m=+79.231789850" observedRunningTime="2026-02-18 05:49:42.754561673 +0000 UTC m=+79.923649905" watchObservedRunningTime="2026-02-18 05:49:42.76237195 +0000 UTC m=+79.931460182" Feb 18 05:49:42 crc kubenswrapper[4869]: I0218 05:49:42.771366 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g8mjl" podStartSLOduration=3.14902747 podStartE2EDuration="35.771353577s" podCreationTimestamp="2026-02-18 05:49:07 +0000 UTC" firstStartedPulling="2026-02-18 05:49:09.059336356 +0000 UTC m=+46.228424588" lastFinishedPulling="2026-02-18 05:49:41.681662463 +0000 UTC m=+78.850750695" observedRunningTime="2026-02-18 05:49:42.769642664 +0000 UTC m=+79.938730906" watchObservedRunningTime="2026-02-18 05:49:42.771353577 +0000 UTC m=+79.940441809" Feb 18 05:49:42 crc kubenswrapper[4869]: I0218 05:49:42.793757 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-97mgq" podStartSLOduration=3.851771022 podStartE2EDuration="33.793720461s" podCreationTimestamp="2026-02-18 05:49:09 +0000 UTC" firstStartedPulling="2026-02-18 05:49:12.186623571 +0000 UTC m=+49.355711803" lastFinishedPulling="2026-02-18 05:49:42.12857301 +0000 UTC m=+79.297661242" observedRunningTime="2026-02-18 05:49:42.790600963 +0000 UTC m=+79.959689195" watchObservedRunningTime="2026-02-18 05:49:42.793720461 +0000 UTC m=+79.962808693"
Feb 18 05:49:47 crc kubenswrapper[4869]: I0218 05:49:47.133312 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lfkdw"
Feb 18 05:49:47 crc kubenswrapper[4869]: I0218 05:49:47.133894 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lfkdw"
Feb 18 05:49:47 crc kubenswrapper[4869]: I0218 05:49:47.290820 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lfkdw"
Feb 18 05:49:47 crc kubenswrapper[4869]: I0218 05:49:47.626765 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g8mjl"
Feb 18 05:49:47 crc kubenswrapper[4869]: I0218 05:49:47.626815 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g8mjl"
Feb 18 05:49:47 crc kubenswrapper[4869]: I0218 05:49:47.667136 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g8mjl"
Feb 18 05:49:47 crc kubenswrapper[4869]: I0218 05:49:47.809084 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g8mjl"
Feb 18 05:49:47 crc kubenswrapper[4869]: I0218 05:49:47.816394 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lfkdw"
Feb 18 05:49:48 crc kubenswrapper[4869]: I0218 05:49:48.482554 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 18 05:49:48 crc kubenswrapper[4869]: E0218 05:49:48.482772 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef4a3562-a825-459a-8b41-cf4f307561d1" containerName="pruner"
Feb 18 05:49:48 crc kubenswrapper[4869]: I0218 05:49:48.482783 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef4a3562-a825-459a-8b41-cf4f307561d1" containerName="pruner"
Feb 18 05:49:48 crc kubenswrapper[4869]: E0218 05:49:48.482795 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb1e0ef-f39a-442b-997b-1d21dba07476" containerName="pruner"
Feb 18 05:49:48 crc kubenswrapper[4869]: I0218 05:49:48.482801 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb1e0ef-f39a-442b-997b-1d21dba07476" containerName="pruner"
Feb 18 05:49:48 crc kubenswrapper[4869]: E0218 05:49:48.482810 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="873e5900-9845-4229-9d30-8b59c34f86fc" containerName="kube-multus-additional-cni-plugins"
Feb 18 05:49:48 crc kubenswrapper[4869]: I0218 05:49:48.482817 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="873e5900-9845-4229-9d30-8b59c34f86fc" containerName="kube-multus-additional-cni-plugins"
Feb 18 05:49:48 crc kubenswrapper[4869]: I0218 05:49:48.483002 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="873e5900-9845-4229-9d30-8b59c34f86fc" containerName="kube-multus-additional-cni-plugins"
Feb 18 05:49:48 crc kubenswrapper[4869]: I0218 05:49:48.483024 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb1e0ef-f39a-442b-997b-1d21dba07476" containerName="pruner"
Feb 18 05:49:48 crc kubenswrapper[4869]: I0218 05:49:48.483033 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef4a3562-a825-459a-8b41-cf4f307561d1" containerName="pruner"
Feb 18 05:49:48 crc kubenswrapper[4869]: I0218 05:49:48.483404 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 18 05:49:48 crc kubenswrapper[4869]: I0218 05:49:48.486978 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 18 05:49:48 crc kubenswrapper[4869]: I0218 05:49:48.487043 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 18 05:49:48 crc kubenswrapper[4869]: I0218 05:49:48.491441 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 18 05:49:48 crc kubenswrapper[4869]: I0218 05:49:48.541636 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb34f921-9588-451d-b936-1748fcdcde3c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bb34f921-9588-451d-b936-1748fcdcde3c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 18 05:49:48 crc kubenswrapper[4869]: I0218 05:49:48.541798 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb34f921-9588-451d-b936-1748fcdcde3c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bb34f921-9588-451d-b936-1748fcdcde3c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 18 05:49:48 crc kubenswrapper[4869]: I0218 05:49:48.587439 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g8mjl"]
Feb 18 05:49:48 crc kubenswrapper[4869]: I0218 05:49:48.643592 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb34f921-9588-451d-b936-1748fcdcde3c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bb34f921-9588-451d-b936-1748fcdcde3c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 18 05:49:48 crc kubenswrapper[4869]: I0218 05:49:48.643675 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb34f921-9588-451d-b936-1748fcdcde3c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bb34f921-9588-451d-b936-1748fcdcde3c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 18 05:49:48 crc kubenswrapper[4869]: I0218 05:49:48.643732 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb34f921-9588-451d-b936-1748fcdcde3c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bb34f921-9588-451d-b936-1748fcdcde3c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 18 05:49:48 crc kubenswrapper[4869]: I0218 05:49:48.664632 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb34f921-9588-451d-b936-1748fcdcde3c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bb34f921-9588-451d-b936-1748fcdcde3c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 18 05:49:48 crc kubenswrapper[4869]: I0218 05:49:48.809996 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 18 05:49:49 crc kubenswrapper[4869]: I0218 05:49:49.128395 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gv5qc"
Feb 18 05:49:49 crc kubenswrapper[4869]: I0218 05:49:49.128900 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gv5qc"
Feb 18 05:49:49 crc kubenswrapper[4869]: I0218 05:49:49.176247 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gv5qc"
Feb 18 05:49:49 crc kubenswrapper[4869]: I0218 05:49:49.215659 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 18 05:49:49 crc kubenswrapper[4869]: W0218 05:49:49.221058 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbb34f921_9588_451d_b936_1748fcdcde3c.slice/crio-a4fb53f0a32afc01bfdefc5373f2825234a9a433594e09b803112674cc7e970b WatchSource:0}: Error finding container a4fb53f0a32afc01bfdefc5373f2825234a9a433594e09b803112674cc7e970b: Status 404 returned error can't find the container with id a4fb53f0a32afc01bfdefc5373f2825234a9a433594e09b803112674cc7e970b
Feb 18 05:49:49 crc kubenswrapper[4869]: I0218 05:49:49.791467 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"bb34f921-9588-451d-b936-1748fcdcde3c","Type":"ContainerStarted","Data":"88fb2b6df2e735b67836935b619f7e8bdf47f7f8ff36a2c6dd59b07298141d51"}
Feb 18 05:49:49 crc kubenswrapper[4869]: I0218 05:49:49.791808 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"bb34f921-9588-451d-b936-1748fcdcde3c","Type":"ContainerStarted","Data":"a4fb53f0a32afc01bfdefc5373f2825234a9a433594e09b803112674cc7e970b"}
Feb 18 05:49:49 crc kubenswrapper[4869]: I0218 05:49:49.791937 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-g8mjl" podUID="5de90b28-d647-4947-b85a-9f65e908ac02" containerName="registry-server" containerID="cri-o://b9706720224d5570fb00ae51e7832d41cce3dd8cb0883a1ef0f86d0806395dd1" gracePeriod=2
Feb 18 05:49:49 crc kubenswrapper[4869]: I0218 05:49:49.832295 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gv5qc"
Feb 18 05:49:49 crc kubenswrapper[4869]: I0218 05:49:49.854932 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=1.8549112239999999 podStartE2EDuration="1.854911224s" podCreationTimestamp="2026-02-18 05:49:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:49.810167725 +0000 UTC m=+86.979255957" watchObservedRunningTime="2026-02-18 05:49:49.854911224 +0000 UTC m=+87.023999456"
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.455691 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-97mgq"
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.456077 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-97mgq"
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.491089 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-97mgq"
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.724429 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g8mjl"
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.731540 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bjkfs"
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.731586 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bjkfs"
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.768398 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5de90b28-d647-4947-b85a-9f65e908ac02-catalog-content\") pod \"5de90b28-d647-4947-b85a-9f65e908ac02\" (UID: \"5de90b28-d647-4947-b85a-9f65e908ac02\") "
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.768478 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bww5k\" (UniqueName: \"kubernetes.io/projected/5de90b28-d647-4947-b85a-9f65e908ac02-kube-api-access-bww5k\") pod \"5de90b28-d647-4947-b85a-9f65e908ac02\" (UID: \"5de90b28-d647-4947-b85a-9f65e908ac02\") "
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.768511 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5de90b28-d647-4947-b85a-9f65e908ac02-utilities\") pod \"5de90b28-d647-4947-b85a-9f65e908ac02\" (UID: \"5de90b28-d647-4947-b85a-9f65e908ac02\") "
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.772424 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5de90b28-d647-4947-b85a-9f65e908ac02-utilities" (OuterVolumeSpecName: "utilities") pod "5de90b28-d647-4947-b85a-9f65e908ac02" (UID: "5de90b28-d647-4947-b85a-9f65e908ac02"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.776167 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5de90b28-d647-4947-b85a-9f65e908ac02-kube-api-access-bww5k" (OuterVolumeSpecName: "kube-api-access-bww5k") pod "5de90b28-d647-4947-b85a-9f65e908ac02" (UID: "5de90b28-d647-4947-b85a-9f65e908ac02"). InnerVolumeSpecName "kube-api-access-bww5k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.776949 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bjkfs"
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.811308 4869 generic.go:334] "Generic (PLEG): container finished" podID="5de90b28-d647-4947-b85a-9f65e908ac02" containerID="b9706720224d5570fb00ae51e7832d41cce3dd8cb0883a1ef0f86d0806395dd1" exitCode=0
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.811588 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8mjl" event={"ID":"5de90b28-d647-4947-b85a-9f65e908ac02","Type":"ContainerDied","Data":"b9706720224d5570fb00ae51e7832d41cce3dd8cb0883a1ef0f86d0806395dd1"}
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.812554 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8mjl" event={"ID":"5de90b28-d647-4947-b85a-9f65e908ac02","Type":"ContainerDied","Data":"eeaa3a4498459dafdbebff7b412237a0064a5053f26b164d33fa6eaac0371b9b"}
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.812576 4869 scope.go:117] "RemoveContainer" containerID="b9706720224d5570fb00ae51e7832d41cce3dd8cb0883a1ef0f86d0806395dd1"
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.811642 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g8mjl"
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.815632 4869 generic.go:334] "Generic (PLEG): container finished" podID="bb34f921-9588-451d-b936-1748fcdcde3c" containerID="88fb2b6df2e735b67836935b619f7e8bdf47f7f8ff36a2c6dd59b07298141d51" exitCode=0
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.816033 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"bb34f921-9588-451d-b936-1748fcdcde3c","Type":"ContainerDied","Data":"88fb2b6df2e735b67836935b619f7e8bdf47f7f8ff36a2c6dd59b07298141d51"}
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.845278 4869 scope.go:117] "RemoveContainer" containerID="074b6b116609e411323ab63cb848c0d6d37d8804bfa4a3de688b129638babf9e"
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.850067 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bjkfs"
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.863515 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5de90b28-d647-4947-b85a-9f65e908ac02-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5de90b28-d647-4947-b85a-9f65e908ac02" (UID: "5de90b28-d647-4947-b85a-9f65e908ac02"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.863893 4869 scope.go:117] "RemoveContainer" containerID="72fe792617d1787cd028ad9fc3f59d3cbf7a2b074e097f9dec94edd6f7d83ca2"
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.869503 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5de90b28-d647-4947-b85a-9f65e908ac02-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.869539 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bww5k\" (UniqueName: \"kubernetes.io/projected/5de90b28-d647-4947-b85a-9f65e908ac02-kube-api-access-bww5k\") on node \"crc\" DevicePath \"\""
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.869557 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5de90b28-d647-4947-b85a-9f65e908ac02-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.875528 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-97mgq"
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.881849 4869 scope.go:117] "RemoveContainer" containerID="b9706720224d5570fb00ae51e7832d41cce3dd8cb0883a1ef0f86d0806395dd1"
Feb 18 05:49:50 crc kubenswrapper[4869]: E0218 05:49:50.882305 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9706720224d5570fb00ae51e7832d41cce3dd8cb0883a1ef0f86d0806395dd1\": container with ID starting with b9706720224d5570fb00ae51e7832d41cce3dd8cb0883a1ef0f86d0806395dd1 not found: ID does not exist" containerID="b9706720224d5570fb00ae51e7832d41cce3dd8cb0883a1ef0f86d0806395dd1"
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.882331 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9706720224d5570fb00ae51e7832d41cce3dd8cb0883a1ef0f86d0806395dd1"} err="failed to get container status \"b9706720224d5570fb00ae51e7832d41cce3dd8cb0883a1ef0f86d0806395dd1\": rpc error: code = NotFound desc = could not find container \"b9706720224d5570fb00ae51e7832d41cce3dd8cb0883a1ef0f86d0806395dd1\": container with ID starting with b9706720224d5570fb00ae51e7832d41cce3dd8cb0883a1ef0f86d0806395dd1 not found: ID does not exist"
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.882363 4869 scope.go:117] "RemoveContainer" containerID="074b6b116609e411323ab63cb848c0d6d37d8804bfa4a3de688b129638babf9e"
Feb 18 05:49:50 crc kubenswrapper[4869]: E0218 05:49:50.882655 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"074b6b116609e411323ab63cb848c0d6d37d8804bfa4a3de688b129638babf9e\": container with ID starting with 074b6b116609e411323ab63cb848c0d6d37d8804bfa4a3de688b129638babf9e not found: ID does not exist" containerID="074b6b116609e411323ab63cb848c0d6d37d8804bfa4a3de688b129638babf9e"
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.882675 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"074b6b116609e411323ab63cb848c0d6d37d8804bfa4a3de688b129638babf9e"} err="failed to get container status \"074b6b116609e411323ab63cb848c0d6d37d8804bfa4a3de688b129638babf9e\": rpc error: code = NotFound desc = could not find container \"074b6b116609e411323ab63cb848c0d6d37d8804bfa4a3de688b129638babf9e\": container with ID starting with 074b6b116609e411323ab63cb848c0d6d37d8804bfa4a3de688b129638babf9e not found: ID does not exist"
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.882691 4869 scope.go:117] "RemoveContainer" containerID="72fe792617d1787cd028ad9fc3f59d3cbf7a2b074e097f9dec94edd6f7d83ca2"
Feb 18 05:49:50 crc kubenswrapper[4869]: E0218 05:49:50.883069 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72fe792617d1787cd028ad9fc3f59d3cbf7a2b074e097f9dec94edd6f7d83ca2\": container with ID starting with 72fe792617d1787cd028ad9fc3f59d3cbf7a2b074e097f9dec94edd6f7d83ca2 not found: ID does not exist" containerID="72fe792617d1787cd028ad9fc3f59d3cbf7a2b074e097f9dec94edd6f7d83ca2"
Feb 18 05:49:50 crc kubenswrapper[4869]: I0218 05:49:50.883084 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72fe792617d1787cd028ad9fc3f59d3cbf7a2b074e097f9dec94edd6f7d83ca2"} err="failed to get container status \"72fe792617d1787cd028ad9fc3f59d3cbf7a2b074e097f9dec94edd6f7d83ca2\": rpc error: code = NotFound desc = could not find container \"72fe792617d1787cd028ad9fc3f59d3cbf7a2b074e097f9dec94edd6f7d83ca2\": container with ID starting with 72fe792617d1787cd028ad9fc3f59d3cbf7a2b074e097f9dec94edd6f7d83ca2 not found: ID does not exist"
Feb 18 05:49:51 crc kubenswrapper[4869]: I0218 05:49:51.141645 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g8mjl"]
Feb 18 05:49:51 crc kubenswrapper[4869]: I0218 05:49:51.147125 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-g8mjl"]
Feb 18 05:49:51 crc kubenswrapper[4869]: I0218 05:49:51.477694 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5de90b28-d647-4947-b85a-9f65e908ac02" path="/var/lib/kubelet/pods/5de90b28-d647-4947-b85a-9f65e908ac02/volumes"
Feb 18 05:49:52 crc kubenswrapper[4869]: I0218 05:49:52.221186 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 18 05:49:52 crc kubenswrapper[4869]: I0218 05:49:52.282689 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb34f921-9588-451d-b936-1748fcdcde3c-kube-api-access\") pod \"bb34f921-9588-451d-b936-1748fcdcde3c\" (UID: \"bb34f921-9588-451d-b936-1748fcdcde3c\") "
Feb 18 05:49:52 crc kubenswrapper[4869]: I0218 05:49:52.283059 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb34f921-9588-451d-b936-1748fcdcde3c-kubelet-dir\") pod \"bb34f921-9588-451d-b936-1748fcdcde3c\" (UID: \"bb34f921-9588-451d-b936-1748fcdcde3c\") "
Feb 18 05:49:52 crc kubenswrapper[4869]: I0218 05:49:52.283199 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bb34f921-9588-451d-b936-1748fcdcde3c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bb34f921-9588-451d-b936-1748fcdcde3c" (UID: "bb34f921-9588-451d-b936-1748fcdcde3c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 05:49:52 crc kubenswrapper[4869]: I0218 05:49:52.283342 4869 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb34f921-9588-451d-b936-1748fcdcde3c-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 18 05:49:52 crc kubenswrapper[4869]: I0218 05:49:52.288465 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb34f921-9588-451d-b936-1748fcdcde3c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bb34f921-9588-451d-b936-1748fcdcde3c" (UID: "bb34f921-9588-451d-b936-1748fcdcde3c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 05:49:52 crc kubenswrapper[4869]: I0218 05:49:52.384007 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb34f921-9588-451d-b936-1748fcdcde3c-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 18 05:49:52 crc kubenswrapper[4869]: I0218 05:49:52.789576 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bjkfs"]
Feb 18 05:49:52 crc kubenswrapper[4869]: I0218 05:49:52.828126 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"bb34f921-9588-451d-b936-1748fcdcde3c","Type":"ContainerDied","Data":"a4fb53f0a32afc01bfdefc5373f2825234a9a433594e09b803112674cc7e970b"}
Feb 18 05:49:52 crc kubenswrapper[4869]: I0218 05:49:52.828168 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4fb53f0a32afc01bfdefc5373f2825234a9a433594e09b803112674cc7e970b"
Feb 18 05:49:52 crc kubenswrapper[4869]: I0218 05:49:52.828138 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 18 05:49:52 crc kubenswrapper[4869]: I0218 05:49:52.828290 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bjkfs" podUID="e661f166-1e0a-481d-86d9-6d062411d9db" containerName="registry-server" containerID="cri-o://729154753d61043c911bed4e8683244342a4acfd0ef525f0a51bb2463518636f" gracePeriod=2
Feb 18 05:49:53 crc kubenswrapper[4869]: I0218 05:49:53.678968 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 18 05:49:53 crc kubenswrapper[4869]: E0218 05:49:53.679220 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5de90b28-d647-4947-b85a-9f65e908ac02" containerName="extract-content"
Feb 18 05:49:53 crc kubenswrapper[4869]: I0218 05:49:53.679235 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="5de90b28-d647-4947-b85a-9f65e908ac02" containerName="extract-content"
Feb 18 05:49:53 crc kubenswrapper[4869]: E0218 05:49:53.679244 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5de90b28-d647-4947-b85a-9f65e908ac02" containerName="registry-server"
Feb 18 05:49:53 crc kubenswrapper[4869]: I0218 05:49:53.679250 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="5de90b28-d647-4947-b85a-9f65e908ac02" containerName="registry-server"
Feb 18 05:49:53 crc kubenswrapper[4869]: E0218 05:49:53.679265 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5de90b28-d647-4947-b85a-9f65e908ac02" containerName="extract-utilities"
Feb 18 05:49:53 crc kubenswrapper[4869]: I0218 05:49:53.679275 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="5de90b28-d647-4947-b85a-9f65e908ac02" containerName="extract-utilities"
Feb 18 05:49:53 crc kubenswrapper[4869]: E0218 05:49:53.679290 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb34f921-9588-451d-b936-1748fcdcde3c" containerName="pruner"
Feb 18 05:49:53 crc kubenswrapper[4869]: I0218 05:49:53.679297 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb34f921-9588-451d-b936-1748fcdcde3c" containerName="pruner"
Feb 18 05:49:53 crc kubenswrapper[4869]: I0218 05:49:53.679450 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb34f921-9588-451d-b936-1748fcdcde3c" containerName="pruner"
Feb 18 05:49:53 crc kubenswrapper[4869]: I0218 05:49:53.679469 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="5de90b28-d647-4947-b85a-9f65e908ac02" containerName="registry-server"
Feb 18 05:49:53 crc kubenswrapper[4869]: I0218 05:49:53.679908 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 18 05:49:53 crc kubenswrapper[4869]: I0218 05:49:53.681858 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 18 05:49:53 crc kubenswrapper[4869]: I0218 05:49:53.683792 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 18 05:49:53 crc kubenswrapper[4869]: I0218 05:49:53.727599 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 18 05:49:53 crc kubenswrapper[4869]: I0218 05:49:53.799143 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d3dab06c-6d1a-4665-a4a6-7549071c8b13-var-lock\") pod \"installer-9-crc\" (UID: \"d3dab06c-6d1a-4665-a4a6-7549071c8b13\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 18 05:49:53 crc kubenswrapper[4869]: I0218 05:49:53.799187 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3dab06c-6d1a-4665-a4a6-7549071c8b13-kube-api-access\") pod \"installer-9-crc\" (UID: \"d3dab06c-6d1a-4665-a4a6-7549071c8b13\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 18 05:49:53 crc kubenswrapper[4869]: I0218 05:49:53.799217 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3dab06c-6d1a-4665-a4a6-7549071c8b13-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d3dab06c-6d1a-4665-a4a6-7549071c8b13\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 18 05:49:53 crc kubenswrapper[4869]: I0218 05:49:53.840785 4869 generic.go:334] "Generic (PLEG): container finished" podID="e661f166-1e0a-481d-86d9-6d062411d9db" containerID="729154753d61043c911bed4e8683244342a4acfd0ef525f0a51bb2463518636f" exitCode=0
Feb 18 05:49:53 crc kubenswrapper[4869]: I0218 05:49:53.840855 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjkfs" event={"ID":"e661f166-1e0a-481d-86d9-6d062411d9db","Type":"ContainerDied","Data":"729154753d61043c911bed4e8683244342a4acfd0ef525f0a51bb2463518636f"}
Feb 18 05:49:53 crc kubenswrapper[4869]: I0218 05:49:53.900826 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3dab06c-6d1a-4665-a4a6-7549071c8b13-kube-api-access\") pod \"installer-9-crc\" (UID: \"d3dab06c-6d1a-4665-a4a6-7549071c8b13\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 18 05:49:53 crc kubenswrapper[4869]: I0218 05:49:53.900893 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3dab06c-6d1a-4665-a4a6-7549071c8b13-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d3dab06c-6d1a-4665-a4a6-7549071c8b13\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 18 05:49:53 crc kubenswrapper[4869]: I0218 05:49:53.900979 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d3dab06c-6d1a-4665-a4a6-7549071c8b13-var-lock\") pod \"installer-9-crc\" (UID: \"d3dab06c-6d1a-4665-a4a6-7549071c8b13\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 18 05:49:53 crc kubenswrapper[4869]: I0218 05:49:53.901086 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d3dab06c-6d1a-4665-a4a6-7549071c8b13-var-lock\") pod \"installer-9-crc\" (UID: \"d3dab06c-6d1a-4665-a4a6-7549071c8b13\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 18 05:49:53 crc kubenswrapper[4869]: I0218 05:49:53.901451 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3dab06c-6d1a-4665-a4a6-7549071c8b13-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d3dab06c-6d1a-4665-a4a6-7549071c8b13\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 18 05:49:53 crc kubenswrapper[4869]: I0218 05:49:53.920149 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3dab06c-6d1a-4665-a4a6-7549071c8b13-kube-api-access\") pod \"installer-9-crc\" (UID: \"d3dab06c-6d1a-4665-a4a6-7549071c8b13\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 18 05:49:53 crc kubenswrapper[4869]: I0218 05:49:53.965087 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bjkfs"
Feb 18 05:49:53 crc kubenswrapper[4869]: I0218 05:49:53.994521 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 18 05:49:54 crc kubenswrapper[4869]: I0218 05:49:54.002172 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5882\" (UniqueName: \"kubernetes.io/projected/e661f166-1e0a-481d-86d9-6d062411d9db-kube-api-access-x5882\") pod \"e661f166-1e0a-481d-86d9-6d062411d9db\" (UID: \"e661f166-1e0a-481d-86d9-6d062411d9db\") "
Feb 18 05:49:54 crc kubenswrapper[4869]: I0218 05:49:54.002234 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e661f166-1e0a-481d-86d9-6d062411d9db-catalog-content\") pod \"e661f166-1e0a-481d-86d9-6d062411d9db\" (UID: \"e661f166-1e0a-481d-86d9-6d062411d9db\") "
Feb 18 05:49:54 crc kubenswrapper[4869]: I0218 05:49:54.002307 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e661f166-1e0a-481d-86d9-6d062411d9db-utilities\") pod \"e661f166-1e0a-481d-86d9-6d062411d9db\" (UID: \"e661f166-1e0a-481d-86d9-6d062411d9db\") "
Feb 18 05:49:54 crc kubenswrapper[4869]: I0218 05:49:54.003665 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e661f166-1e0a-481d-86d9-6d062411d9db-utilities" (OuterVolumeSpecName: "utilities") pod "e661f166-1e0a-481d-86d9-6d062411d9db" (UID: "e661f166-1e0a-481d-86d9-6d062411d9db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 05:49:54 crc kubenswrapper[4869]: I0218 05:49:54.010012 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e661f166-1e0a-481d-86d9-6d062411d9db-kube-api-access-x5882" (OuterVolumeSpecName: "kube-api-access-x5882") pod "e661f166-1e0a-481d-86d9-6d062411d9db" (UID: "e661f166-1e0a-481d-86d9-6d062411d9db"). InnerVolumeSpecName "kube-api-access-x5882". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 05:49:54 crc kubenswrapper[4869]: I0218 05:49:54.104253 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5882\" (UniqueName: \"kubernetes.io/projected/e661f166-1e0a-481d-86d9-6d062411d9db-kube-api-access-x5882\") on node \"crc\" DevicePath \"\""
Feb 18 05:49:54 crc kubenswrapper[4869]: I0218 05:49:54.104279 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e661f166-1e0a-481d-86d9-6d062411d9db-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 05:49:54 crc kubenswrapper[4869]: I0218 05:49:54.149397 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e661f166-1e0a-481d-86d9-6d062411d9db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e661f166-1e0a-481d-86d9-6d062411d9db" (UID: "e661f166-1e0a-481d-86d9-6d062411d9db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 05:49:54 crc kubenswrapper[4869]: I0218 05:49:54.206455 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e661f166-1e0a-481d-86d9-6d062411d9db-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 05:49:54 crc kubenswrapper[4869]: I0218 05:49:54.207184 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 18 05:49:54 crc kubenswrapper[4869]: W0218 05:49:54.218632 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd3dab06c_6d1a_4665_a4a6_7549071c8b13.slice/crio-afc669edd33f1c7a94e8b437dbed535ecbbad071bb7b6fc683027ec03505c5fe WatchSource:0}: Error finding container afc669edd33f1c7a94e8b437dbed535ecbbad071bb7b6fc683027ec03505c5fe: Status 404 returned error can't find the container with id afc669edd33f1c7a94e8b437dbed535ecbbad071bb7b6fc683027ec03505c5fe
Feb 18 05:49:54 crc kubenswrapper[4869]: I0218 05:49:54.847099 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d3dab06c-6d1a-4665-a4a6-7549071c8b13","Type":"ContainerStarted","Data":"d572ed22414e1f6700dffa5a2a21451daf5ef58a6fb4eda887b77db4e14aa617"}
Feb 18 05:49:54 crc kubenswrapper[4869]: I0218 05:49:54.847459 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d3dab06c-6d1a-4665-a4a6-7549071c8b13","Type":"ContainerStarted","Data":"afc669edd33f1c7a94e8b437dbed535ecbbad071bb7b6fc683027ec03505c5fe"}
Feb 18 05:49:54 crc kubenswrapper[4869]: I0218 05:49:54.850464 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjkfs" event={"ID":"e661f166-1e0a-481d-86d9-6d062411d9db","Type":"ContainerDied","Data":"3c9e48409eb4dd793e34d319b2b7109584f3549e5c83d58c515ad0d1391e8e9b"}
Feb 18 05:49:54 crc kubenswrapper[4869]: I0218 05:49:54.850497 4869 scope.go:117] "RemoveContainer" containerID="729154753d61043c911bed4e8683244342a4acfd0ef525f0a51bb2463518636f"
Feb 18 05:49:54 crc kubenswrapper[4869]: I0218 05:49:54.850582 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bjkfs"
Feb 18 05:49:54 crc kubenswrapper[4869]: I0218 05:49:54.870838 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.870819035 podStartE2EDuration="1.870819035s" podCreationTimestamp="2026-02-18 05:49:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:54.866893297 +0000 UTC m=+92.035981539" watchObservedRunningTime="2026-02-18 05:49:54.870819035 +0000 UTC m=+92.039907287"
Feb 18 05:49:54 crc kubenswrapper[4869]: I0218 05:49:54.873849 4869 scope.go:117] "RemoveContainer" containerID="d5864f84fcebea715e70c0aab4c21c19ace693f106d0d3edd93278ea4e82a9c9"
Feb 18 05:49:54 crc kubenswrapper[4869]: I0218 05:49:54.886327 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bjkfs"]
Feb 18 05:49:54 crc kubenswrapper[4869]: I0218 05:49:54.888777 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bjkfs"]
Feb 18 05:49:54 crc kubenswrapper[4869]: I0218 05:49:54.909657 4869 scope.go:117] "RemoveContainer" containerID="4f712046b1aed10017948a1b762bf0d966459745aa33143d658a0afc67fa26e8"
Feb 18 05:49:55 crc kubenswrapper[4869]: I0218 05:49:55.405691 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 05:49:55 crc kubenswrapper[4869]: I0218 05:49:55.479980 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e661f166-1e0a-481d-86d9-6d062411d9db" path="/var/lib/kubelet/pods/e661f166-1e0a-481d-86d9-6d062411d9db/volumes"
Feb 18 05:49:55 crc kubenswrapper[4869]: I0218 05:49:55.858000 4869 generic.go:334] "Generic (PLEG): container finished" podID="a42348a5-962e-42b1-a0f5-67a89bd0532a"
containerID="7ed4f109c893d92a4c4df3ce9f222ab88044d96c8240c612533a9089893ac5c2" exitCode=0 Feb 18 05:49:55 crc kubenswrapper[4869]: I0218 05:49:55.858060 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vpbh" event={"ID":"a42348a5-962e-42b1-a0f5-67a89bd0532a","Type":"ContainerDied","Data":"7ed4f109c893d92a4c4df3ce9f222ab88044d96c8240c612533a9089893ac5c2"} Feb 18 05:49:55 crc kubenswrapper[4869]: I0218 05:49:55.864857 4869 generic.go:334] "Generic (PLEG): container finished" podID="1fa8c341-03cc-49d4-8793-88cec4a8444d" containerID="f79414943cc9d27aa0dc7c1953eb659343b2b7594344e830b0fb579fde25488e" exitCode=0 Feb 18 05:49:55 crc kubenswrapper[4869]: I0218 05:49:55.864926 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2q2rk" event={"ID":"1fa8c341-03cc-49d4-8793-88cec4a8444d","Type":"ContainerDied","Data":"f79414943cc9d27aa0dc7c1953eb659343b2b7594344e830b0fb579fde25488e"} Feb 18 05:49:55 crc kubenswrapper[4869]: I0218 05:49:55.871378 4869 generic.go:334] "Generic (PLEG): container finished" podID="143f68ff-2d5f-435c-8a05-ebc0433fca48" containerID="dd355f31441fa1fab69c01654f6e6496bc65c7f9d5ed63fa5edd56ddf5a123da" exitCode=0 Feb 18 05:49:55 crc kubenswrapper[4869]: I0218 05:49:55.871524 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brm92" event={"ID":"143f68ff-2d5f-435c-8a05-ebc0433fca48","Type":"ContainerDied","Data":"dd355f31441fa1fab69c01654f6e6496bc65c7f9d5ed63fa5edd56ddf5a123da"} Feb 18 05:49:56 crc kubenswrapper[4869]: I0218 05:49:56.878963 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2q2rk" event={"ID":"1fa8c341-03cc-49d4-8793-88cec4a8444d","Type":"ContainerStarted","Data":"24967964512e784c819a2c86b1cc46cded4a3ca53a682d534ae470b55ec500fa"} Feb 18 05:49:56 crc kubenswrapper[4869]: I0218 05:49:56.881853 4869 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-brm92" event={"ID":"143f68ff-2d5f-435c-8a05-ebc0433fca48","Type":"ContainerStarted","Data":"bd1906a24b86b80367ee217b46281cc2389a467c03ce1be6f34bc759d95fe598"} Feb 18 05:49:56 crc kubenswrapper[4869]: I0218 05:49:56.884403 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vpbh" event={"ID":"a42348a5-962e-42b1-a0f5-67a89bd0532a","Type":"ContainerStarted","Data":"f133b1a132f2907992d1e039663df47b5bcfd15b1949f95b8c6830351b410258"} Feb 18 05:49:56 crc kubenswrapper[4869]: I0218 05:49:56.899543 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2q2rk" podStartSLOduration=3.667236262 podStartE2EDuration="50.899525308s" podCreationTimestamp="2026-02-18 05:49:06 +0000 UTC" firstStartedPulling="2026-02-18 05:49:09.059300965 +0000 UTC m=+46.228389237" lastFinishedPulling="2026-02-18 05:49:56.291590041 +0000 UTC m=+93.460678283" observedRunningTime="2026-02-18 05:49:56.895324952 +0000 UTC m=+94.064413184" watchObservedRunningTime="2026-02-18 05:49:56.899525308 +0000 UTC m=+94.068613560" Feb 18 05:49:56 crc kubenswrapper[4869]: I0218 05:49:56.918857 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-brm92" podStartSLOduration=2.6058134649999998 podStartE2EDuration="49.918838796s" podCreationTimestamp="2026-02-18 05:49:07 +0000 UTC" firstStartedPulling="2026-02-18 05:49:09.078952843 +0000 UTC m=+46.248041065" lastFinishedPulling="2026-02-18 05:49:56.391978164 +0000 UTC m=+93.561066396" observedRunningTime="2026-02-18 05:49:56.916398514 +0000 UTC m=+94.085486746" watchObservedRunningTime="2026-02-18 05:49:56.918838796 +0000 UTC m=+94.087927038" Feb 18 05:49:56 crc kubenswrapper[4869]: I0218 05:49:56.940599 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5vpbh" 
podStartSLOduration=1.786669293 podStartE2EDuration="47.940583975s" podCreationTimestamp="2026-02-18 05:49:09 +0000 UTC" firstStartedPulling="2026-02-18 05:49:10.111618701 +0000 UTC m=+47.280706933" lastFinishedPulling="2026-02-18 05:49:56.265533373 +0000 UTC m=+93.434621615" observedRunningTime="2026-02-18 05:49:56.940509703 +0000 UTC m=+94.109597935" watchObservedRunningTime="2026-02-18 05:49:56.940583975 +0000 UTC m=+94.109672197" Feb 18 05:49:57 crc kubenswrapper[4869]: I0218 05:49:57.365079 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2q2rk" Feb 18 05:49:57 crc kubenswrapper[4869]: I0218 05:49:57.365142 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2q2rk" Feb 18 05:49:57 crc kubenswrapper[4869]: I0218 05:49:57.871183 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-brm92" Feb 18 05:49:57 crc kubenswrapper[4869]: I0218 05:49:57.871240 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-brm92" Feb 18 05:49:58 crc kubenswrapper[4869]: I0218 05:49:58.400700 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-2q2rk" podUID="1fa8c341-03cc-49d4-8793-88cec4a8444d" containerName="registry-server" probeResult="failure" output=< Feb 18 05:49:58 crc kubenswrapper[4869]: timeout: failed to connect service ":50051" within 1s Feb 18 05:49:58 crc kubenswrapper[4869]: > Feb 18 05:49:58 crc kubenswrapper[4869]: I0218 05:49:58.906627 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-brm92" podUID="143f68ff-2d5f-435c-8a05-ebc0433fca48" containerName="registry-server" probeResult="failure" output=< Feb 18 05:49:58 crc kubenswrapper[4869]: timeout: failed to connect service ":50051" within 1s Feb 18 
05:49:58 crc kubenswrapper[4869]: > Feb 18 05:49:59 crc kubenswrapper[4869]: I0218 05:49:59.525471 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5vpbh" Feb 18 05:49:59 crc kubenswrapper[4869]: I0218 05:49:59.525817 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5vpbh" Feb 18 05:49:59 crc kubenswrapper[4869]: I0218 05:49:59.569452 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5vpbh" Feb 18 05:50:01 crc kubenswrapper[4869]: I0218 05:50:01.508603 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 18 05:50:03 crc kubenswrapper[4869]: I0218 05:50:03.504718 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=2.504696807 podStartE2EDuration="2.504696807s" podCreationTimestamp="2026-02-18 05:50:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:50:03.49929495 +0000 UTC m=+100.668383212" watchObservedRunningTime="2026-02-18 05:50:03.504696807 +0000 UTC m=+100.673785039" Feb 18 05:50:07 crc kubenswrapper[4869]: I0218 05:50:07.417486 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2q2rk" Feb 18 05:50:07 crc kubenswrapper[4869]: I0218 05:50:07.468635 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2q2rk" Feb 18 05:50:07 crc kubenswrapper[4869]: I0218 05:50:07.941919 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-brm92" Feb 18 05:50:08 crc kubenswrapper[4869]: I0218 05:50:08.004500 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/community-operators-brm92" Feb 18 05:50:09 crc kubenswrapper[4869]: I0218 05:50:09.187485 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-brm92"] Feb 18 05:50:09 crc kubenswrapper[4869]: I0218 05:50:09.187753 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-brm92" podUID="143f68ff-2d5f-435c-8a05-ebc0433fca48" containerName="registry-server" containerID="cri-o://bd1906a24b86b80367ee217b46281cc2389a467c03ce1be6f34bc759d95fe598" gracePeriod=2 Feb 18 05:50:09 crc kubenswrapper[4869]: I0218 05:50:09.568112 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5vpbh" Feb 18 05:50:09 crc kubenswrapper[4869]: I0218 05:50:09.586654 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-brm92" Feb 18 05:50:09 crc kubenswrapper[4869]: I0218 05:50:09.706155 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/143f68ff-2d5f-435c-8a05-ebc0433fca48-utilities\") pod \"143f68ff-2d5f-435c-8a05-ebc0433fca48\" (UID: \"143f68ff-2d5f-435c-8a05-ebc0433fca48\") " Feb 18 05:50:09 crc kubenswrapper[4869]: I0218 05:50:09.706295 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8m7j\" (UniqueName: \"kubernetes.io/projected/143f68ff-2d5f-435c-8a05-ebc0433fca48-kube-api-access-q8m7j\") pod \"143f68ff-2d5f-435c-8a05-ebc0433fca48\" (UID: \"143f68ff-2d5f-435c-8a05-ebc0433fca48\") " Feb 18 05:50:09 crc kubenswrapper[4869]: I0218 05:50:09.706336 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/143f68ff-2d5f-435c-8a05-ebc0433fca48-catalog-content\") pod 
\"143f68ff-2d5f-435c-8a05-ebc0433fca48\" (UID: \"143f68ff-2d5f-435c-8a05-ebc0433fca48\") " Feb 18 05:50:09 crc kubenswrapper[4869]: I0218 05:50:09.707156 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/143f68ff-2d5f-435c-8a05-ebc0433fca48-utilities" (OuterVolumeSpecName: "utilities") pod "143f68ff-2d5f-435c-8a05-ebc0433fca48" (UID: "143f68ff-2d5f-435c-8a05-ebc0433fca48"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:50:09 crc kubenswrapper[4869]: I0218 05:50:09.718953 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/143f68ff-2d5f-435c-8a05-ebc0433fca48-kube-api-access-q8m7j" (OuterVolumeSpecName: "kube-api-access-q8m7j") pod "143f68ff-2d5f-435c-8a05-ebc0433fca48" (UID: "143f68ff-2d5f-435c-8a05-ebc0433fca48"). InnerVolumeSpecName "kube-api-access-q8m7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:50:09 crc kubenswrapper[4869]: I0218 05:50:09.756033 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/143f68ff-2d5f-435c-8a05-ebc0433fca48-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "143f68ff-2d5f-435c-8a05-ebc0433fca48" (UID: "143f68ff-2d5f-435c-8a05-ebc0433fca48"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:50:09 crc kubenswrapper[4869]: I0218 05:50:09.807269 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/143f68ff-2d5f-435c-8a05-ebc0433fca48-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 05:50:09 crc kubenswrapper[4869]: I0218 05:50:09.807305 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/143f68ff-2d5f-435c-8a05-ebc0433fca48-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 05:50:09 crc kubenswrapper[4869]: I0218 05:50:09.807316 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8m7j\" (UniqueName: \"kubernetes.io/projected/143f68ff-2d5f-435c-8a05-ebc0433fca48-kube-api-access-q8m7j\") on node \"crc\" DevicePath \"\"" Feb 18 05:50:09 crc kubenswrapper[4869]: I0218 05:50:09.965981 4869 generic.go:334] "Generic (PLEG): container finished" podID="143f68ff-2d5f-435c-8a05-ebc0433fca48" containerID="bd1906a24b86b80367ee217b46281cc2389a467c03ce1be6f34bc759d95fe598" exitCode=0 Feb 18 05:50:09 crc kubenswrapper[4869]: I0218 05:50:09.966042 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brm92" event={"ID":"143f68ff-2d5f-435c-8a05-ebc0433fca48","Type":"ContainerDied","Data":"bd1906a24b86b80367ee217b46281cc2389a467c03ce1be6f34bc759d95fe598"} Feb 18 05:50:09 crc kubenswrapper[4869]: I0218 05:50:09.966066 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brm92" event={"ID":"143f68ff-2d5f-435c-8a05-ebc0433fca48","Type":"ContainerDied","Data":"b2ab69e8b7ea84355d9e0455728ee1d790e1afe74bcb83051f7af1bfd0ac604f"} Feb 18 05:50:09 crc kubenswrapper[4869]: I0218 05:50:09.966082 4869 scope.go:117] "RemoveContainer" containerID="bd1906a24b86b80367ee217b46281cc2389a467c03ce1be6f34bc759d95fe598" Feb 18 05:50:09 crc kubenswrapper[4869]: I0218 
05:50:09.966184 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-brm92" Feb 18 05:50:09 crc kubenswrapper[4869]: I0218 05:50:09.994048 4869 scope.go:117] "RemoveContainer" containerID="dd355f31441fa1fab69c01654f6e6496bc65c7f9d5ed63fa5edd56ddf5a123da" Feb 18 05:50:09 crc kubenswrapper[4869]: I0218 05:50:09.994691 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-brm92"] Feb 18 05:50:09 crc kubenswrapper[4869]: I0218 05:50:09.997187 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-brm92"] Feb 18 05:50:10 crc kubenswrapper[4869]: I0218 05:50:10.014787 4869 scope.go:117] "RemoveContainer" containerID="7c6074869d2c2db1664c926ba08a22ebc407b53d191b385d76346e8532a8e917" Feb 18 05:50:10 crc kubenswrapper[4869]: I0218 05:50:10.027923 4869 scope.go:117] "RemoveContainer" containerID="bd1906a24b86b80367ee217b46281cc2389a467c03ce1be6f34bc759d95fe598" Feb 18 05:50:10 crc kubenswrapper[4869]: E0218 05:50:10.028343 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd1906a24b86b80367ee217b46281cc2389a467c03ce1be6f34bc759d95fe598\": container with ID starting with bd1906a24b86b80367ee217b46281cc2389a467c03ce1be6f34bc759d95fe598 not found: ID does not exist" containerID="bd1906a24b86b80367ee217b46281cc2389a467c03ce1be6f34bc759d95fe598" Feb 18 05:50:10 crc kubenswrapper[4869]: I0218 05:50:10.028384 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd1906a24b86b80367ee217b46281cc2389a467c03ce1be6f34bc759d95fe598"} err="failed to get container status \"bd1906a24b86b80367ee217b46281cc2389a467c03ce1be6f34bc759d95fe598\": rpc error: code = NotFound desc = could not find container \"bd1906a24b86b80367ee217b46281cc2389a467c03ce1be6f34bc759d95fe598\": container with ID starting with 
bd1906a24b86b80367ee217b46281cc2389a467c03ce1be6f34bc759d95fe598 not found: ID does not exist" Feb 18 05:50:10 crc kubenswrapper[4869]: I0218 05:50:10.028424 4869 scope.go:117] "RemoveContainer" containerID="dd355f31441fa1fab69c01654f6e6496bc65c7f9d5ed63fa5edd56ddf5a123da" Feb 18 05:50:10 crc kubenswrapper[4869]: E0218 05:50:10.028723 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd355f31441fa1fab69c01654f6e6496bc65c7f9d5ed63fa5edd56ddf5a123da\": container with ID starting with dd355f31441fa1fab69c01654f6e6496bc65c7f9d5ed63fa5edd56ddf5a123da not found: ID does not exist" containerID="dd355f31441fa1fab69c01654f6e6496bc65c7f9d5ed63fa5edd56ddf5a123da" Feb 18 05:50:10 crc kubenswrapper[4869]: I0218 05:50:10.028783 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd355f31441fa1fab69c01654f6e6496bc65c7f9d5ed63fa5edd56ddf5a123da"} err="failed to get container status \"dd355f31441fa1fab69c01654f6e6496bc65c7f9d5ed63fa5edd56ddf5a123da\": rpc error: code = NotFound desc = could not find container \"dd355f31441fa1fab69c01654f6e6496bc65c7f9d5ed63fa5edd56ddf5a123da\": container with ID starting with dd355f31441fa1fab69c01654f6e6496bc65c7f9d5ed63fa5edd56ddf5a123da not found: ID does not exist" Feb 18 05:50:10 crc kubenswrapper[4869]: I0218 05:50:10.028814 4869 scope.go:117] "RemoveContainer" containerID="7c6074869d2c2db1664c926ba08a22ebc407b53d191b385d76346e8532a8e917" Feb 18 05:50:10 crc kubenswrapper[4869]: E0218 05:50:10.029166 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c6074869d2c2db1664c926ba08a22ebc407b53d191b385d76346e8532a8e917\": container with ID starting with 7c6074869d2c2db1664c926ba08a22ebc407b53d191b385d76346e8532a8e917 not found: ID does not exist" containerID="7c6074869d2c2db1664c926ba08a22ebc407b53d191b385d76346e8532a8e917" Feb 18 05:50:10 crc 
kubenswrapper[4869]: I0218 05:50:10.029191 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c6074869d2c2db1664c926ba08a22ebc407b53d191b385d76346e8532a8e917"} err="failed to get container status \"7c6074869d2c2db1664c926ba08a22ebc407b53d191b385d76346e8532a8e917\": rpc error: code = NotFound desc = could not find container \"7c6074869d2c2db1664c926ba08a22ebc407b53d191b385d76346e8532a8e917\": container with ID starting with 7c6074869d2c2db1664c926ba08a22ebc407b53d191b385d76346e8532a8e917 not found: ID does not exist" Feb 18 05:50:11 crc kubenswrapper[4869]: I0218 05:50:11.022668 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tthlh"] Feb 18 05:50:11 crc kubenswrapper[4869]: I0218 05:50:11.478785 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="143f68ff-2d5f-435c-8a05-ebc0433fca48" path="/var/lib/kubelet/pods/143f68ff-2d5f-435c-8a05-ebc0433fca48/volumes" Feb 18 05:50:11 crc kubenswrapper[4869]: I0218 05:50:11.585769 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5vpbh"] Feb 18 05:50:11 crc kubenswrapper[4869]: I0218 05:50:11.585979 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5vpbh" podUID="a42348a5-962e-42b1-a0f5-67a89bd0532a" containerName="registry-server" containerID="cri-o://f133b1a132f2907992d1e039663df47b5bcfd15b1949f95b8c6830351b410258" gracePeriod=2 Feb 18 05:50:11 crc kubenswrapper[4869]: I0218 05:50:11.911873 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5vpbh" Feb 18 05:50:11 crc kubenswrapper[4869]: I0218 05:50:11.979149 4869 generic.go:334] "Generic (PLEG): container finished" podID="a42348a5-962e-42b1-a0f5-67a89bd0532a" containerID="f133b1a132f2907992d1e039663df47b5bcfd15b1949f95b8c6830351b410258" exitCode=0 Feb 18 05:50:11 crc kubenswrapper[4869]: I0218 05:50:11.979197 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vpbh" event={"ID":"a42348a5-962e-42b1-a0f5-67a89bd0532a","Type":"ContainerDied","Data":"f133b1a132f2907992d1e039663df47b5bcfd15b1949f95b8c6830351b410258"} Feb 18 05:50:11 crc kubenswrapper[4869]: I0218 05:50:11.979226 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vpbh" event={"ID":"a42348a5-962e-42b1-a0f5-67a89bd0532a","Type":"ContainerDied","Data":"4f3f8a26df68d05568af27c22a622dc304bc20728aa39f3d5a257e076d236c24"} Feb 18 05:50:11 crc kubenswrapper[4869]: I0218 05:50:11.979245 4869 scope.go:117] "RemoveContainer" containerID="f133b1a132f2907992d1e039663df47b5bcfd15b1949f95b8c6830351b410258" Feb 18 05:50:11 crc kubenswrapper[4869]: I0218 05:50:11.979331 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5vpbh" Feb 18 05:50:11 crc kubenswrapper[4869]: I0218 05:50:11.993864 4869 scope.go:117] "RemoveContainer" containerID="7ed4f109c893d92a4c4df3ce9f222ab88044d96c8240c612533a9089893ac5c2" Feb 18 05:50:12 crc kubenswrapper[4869]: I0218 05:50:12.006616 4869 scope.go:117] "RemoveContainer" containerID="bd597523762c7b7ff787b2238bf48a9ebe1d39cc01d2a8591fd90d32171f7e37" Feb 18 05:50:12 crc kubenswrapper[4869]: I0218 05:50:12.021819 4869 scope.go:117] "RemoveContainer" containerID="f133b1a132f2907992d1e039663df47b5bcfd15b1949f95b8c6830351b410258" Feb 18 05:50:12 crc kubenswrapper[4869]: E0218 05:50:12.022106 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f133b1a132f2907992d1e039663df47b5bcfd15b1949f95b8c6830351b410258\": container with ID starting with f133b1a132f2907992d1e039663df47b5bcfd15b1949f95b8c6830351b410258 not found: ID does not exist" containerID="f133b1a132f2907992d1e039663df47b5bcfd15b1949f95b8c6830351b410258" Feb 18 05:50:12 crc kubenswrapper[4869]: I0218 05:50:12.022150 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f133b1a132f2907992d1e039663df47b5bcfd15b1949f95b8c6830351b410258"} err="failed to get container status \"f133b1a132f2907992d1e039663df47b5bcfd15b1949f95b8c6830351b410258\": rpc error: code = NotFound desc = could not find container \"f133b1a132f2907992d1e039663df47b5bcfd15b1949f95b8c6830351b410258\": container with ID starting with f133b1a132f2907992d1e039663df47b5bcfd15b1949f95b8c6830351b410258 not found: ID does not exist" Feb 18 05:50:12 crc kubenswrapper[4869]: I0218 05:50:12.022183 4869 scope.go:117] "RemoveContainer" containerID="7ed4f109c893d92a4c4df3ce9f222ab88044d96c8240c612533a9089893ac5c2" Feb 18 05:50:12 crc kubenswrapper[4869]: E0218 05:50:12.022487 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"7ed4f109c893d92a4c4df3ce9f222ab88044d96c8240c612533a9089893ac5c2\": container with ID starting with 7ed4f109c893d92a4c4df3ce9f222ab88044d96c8240c612533a9089893ac5c2 not found: ID does not exist" containerID="7ed4f109c893d92a4c4df3ce9f222ab88044d96c8240c612533a9089893ac5c2" Feb 18 05:50:12 crc kubenswrapper[4869]: I0218 05:50:12.022509 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ed4f109c893d92a4c4df3ce9f222ab88044d96c8240c612533a9089893ac5c2"} err="failed to get container status \"7ed4f109c893d92a4c4df3ce9f222ab88044d96c8240c612533a9089893ac5c2\": rpc error: code = NotFound desc = could not find container \"7ed4f109c893d92a4c4df3ce9f222ab88044d96c8240c612533a9089893ac5c2\": container with ID starting with 7ed4f109c893d92a4c4df3ce9f222ab88044d96c8240c612533a9089893ac5c2 not found: ID does not exist" Feb 18 05:50:12 crc kubenswrapper[4869]: I0218 05:50:12.022526 4869 scope.go:117] "RemoveContainer" containerID="bd597523762c7b7ff787b2238bf48a9ebe1d39cc01d2a8591fd90d32171f7e37" Feb 18 05:50:12 crc kubenswrapper[4869]: E0218 05:50:12.022774 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd597523762c7b7ff787b2238bf48a9ebe1d39cc01d2a8591fd90d32171f7e37\": container with ID starting with bd597523762c7b7ff787b2238bf48a9ebe1d39cc01d2a8591fd90d32171f7e37 not found: ID does not exist" containerID="bd597523762c7b7ff787b2238bf48a9ebe1d39cc01d2a8591fd90d32171f7e37" Feb 18 05:50:12 crc kubenswrapper[4869]: I0218 05:50:12.022804 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd597523762c7b7ff787b2238bf48a9ebe1d39cc01d2a8591fd90d32171f7e37"} err="failed to get container status \"bd597523762c7b7ff787b2238bf48a9ebe1d39cc01d2a8591fd90d32171f7e37\": rpc error: code = NotFound desc = could not find container 
\"bd597523762c7b7ff787b2238bf48a9ebe1d39cc01d2a8591fd90d32171f7e37\": container with ID starting with bd597523762c7b7ff787b2238bf48a9ebe1d39cc01d2a8591fd90d32171f7e37 not found: ID does not exist" Feb 18 05:50:12 crc kubenswrapper[4869]: I0218 05:50:12.033261 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8gzp\" (UniqueName: \"kubernetes.io/projected/a42348a5-962e-42b1-a0f5-67a89bd0532a-kube-api-access-b8gzp\") pod \"a42348a5-962e-42b1-a0f5-67a89bd0532a\" (UID: \"a42348a5-962e-42b1-a0f5-67a89bd0532a\") " Feb 18 05:50:12 crc kubenswrapper[4869]: I0218 05:50:12.033312 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a42348a5-962e-42b1-a0f5-67a89bd0532a-catalog-content\") pod \"a42348a5-962e-42b1-a0f5-67a89bd0532a\" (UID: \"a42348a5-962e-42b1-a0f5-67a89bd0532a\") " Feb 18 05:50:12 crc kubenswrapper[4869]: I0218 05:50:12.033336 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a42348a5-962e-42b1-a0f5-67a89bd0532a-utilities\") pod \"a42348a5-962e-42b1-a0f5-67a89bd0532a\" (UID: \"a42348a5-962e-42b1-a0f5-67a89bd0532a\") " Feb 18 05:50:12 crc kubenswrapper[4869]: I0218 05:50:12.034166 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a42348a5-962e-42b1-a0f5-67a89bd0532a-utilities" (OuterVolumeSpecName: "utilities") pod "a42348a5-962e-42b1-a0f5-67a89bd0532a" (UID: "a42348a5-962e-42b1-a0f5-67a89bd0532a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:50:12 crc kubenswrapper[4869]: I0218 05:50:12.038386 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a42348a5-962e-42b1-a0f5-67a89bd0532a-kube-api-access-b8gzp" (OuterVolumeSpecName: "kube-api-access-b8gzp") pod "a42348a5-962e-42b1-a0f5-67a89bd0532a" (UID: "a42348a5-962e-42b1-a0f5-67a89bd0532a"). InnerVolumeSpecName "kube-api-access-b8gzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:50:12 crc kubenswrapper[4869]: I0218 05:50:12.057368 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a42348a5-962e-42b1-a0f5-67a89bd0532a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a42348a5-962e-42b1-a0f5-67a89bd0532a" (UID: "a42348a5-962e-42b1-a0f5-67a89bd0532a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:50:12 crc kubenswrapper[4869]: I0218 05:50:12.134512 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8gzp\" (UniqueName: \"kubernetes.io/projected/a42348a5-962e-42b1-a0f5-67a89bd0532a-kube-api-access-b8gzp\") on node \"crc\" DevicePath \"\"" Feb 18 05:50:12 crc kubenswrapper[4869]: I0218 05:50:12.134544 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a42348a5-962e-42b1-a0f5-67a89bd0532a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 05:50:12 crc kubenswrapper[4869]: I0218 05:50:12.134553 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a42348a5-962e-42b1-a0f5-67a89bd0532a-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 05:50:12 crc kubenswrapper[4869]: I0218 05:50:12.313490 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5vpbh"] Feb 18 05:50:12 crc kubenswrapper[4869]: I0218 
05:50:12.321864 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5vpbh"] Feb 18 05:50:13 crc kubenswrapper[4869]: I0218 05:50:13.478339 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a42348a5-962e-42b1-a0f5-67a89bd0532a" path="/var/lib/kubelet/pods/a42348a5-962e-42b1-a0f5-67a89bd0532a/volumes" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.192583 4869 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 18 05:50:32 crc kubenswrapper[4869]: E0218 05:50:32.193731 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e661f166-1e0a-481d-86d9-6d062411d9db" containerName="extract-content" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.193747 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e661f166-1e0a-481d-86d9-6d062411d9db" containerName="extract-content" Feb 18 05:50:32 crc kubenswrapper[4869]: E0218 05:50:32.193757 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="143f68ff-2d5f-435c-8a05-ebc0433fca48" containerName="registry-server" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.193763 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="143f68ff-2d5f-435c-8a05-ebc0433fca48" containerName="registry-server" Feb 18 05:50:32 crc kubenswrapper[4869]: E0218 05:50:32.193782 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="143f68ff-2d5f-435c-8a05-ebc0433fca48" containerName="extract-utilities" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.193788 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="143f68ff-2d5f-435c-8a05-ebc0433fca48" containerName="extract-utilities" Feb 18 05:50:32 crc kubenswrapper[4869]: E0218 05:50:32.193802 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e661f166-1e0a-481d-86d9-6d062411d9db" containerName="extract-utilities" Feb 18 05:50:32 crc 
kubenswrapper[4869]: I0218 05:50:32.193807 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e661f166-1e0a-481d-86d9-6d062411d9db" containerName="extract-utilities" Feb 18 05:50:32 crc kubenswrapper[4869]: E0218 05:50:32.193815 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e661f166-1e0a-481d-86d9-6d062411d9db" containerName="registry-server" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.193822 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e661f166-1e0a-481d-86d9-6d062411d9db" containerName="registry-server" Feb 18 05:50:32 crc kubenswrapper[4869]: E0218 05:50:32.193830 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a42348a5-962e-42b1-a0f5-67a89bd0532a" containerName="extract-utilities" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.193836 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42348a5-962e-42b1-a0f5-67a89bd0532a" containerName="extract-utilities" Feb 18 05:50:32 crc kubenswrapper[4869]: E0218 05:50:32.193845 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="143f68ff-2d5f-435c-8a05-ebc0433fca48" containerName="extract-content" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.193850 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="143f68ff-2d5f-435c-8a05-ebc0433fca48" containerName="extract-content" Feb 18 05:50:32 crc kubenswrapper[4869]: E0218 05:50:32.193858 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a42348a5-962e-42b1-a0f5-67a89bd0532a" containerName="registry-server" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.193864 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42348a5-962e-42b1-a0f5-67a89bd0532a" containerName="registry-server" Feb 18 05:50:32 crc kubenswrapper[4869]: E0218 05:50:32.193871 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a42348a5-962e-42b1-a0f5-67a89bd0532a" containerName="extract-content" Feb 18 05:50:32 crc 
kubenswrapper[4869]: I0218 05:50:32.193877 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42348a5-962e-42b1-a0f5-67a89bd0532a" containerName="extract-content" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.193958 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="143f68ff-2d5f-435c-8a05-ebc0433fca48" containerName="registry-server" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.193966 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="a42348a5-962e-42b1-a0f5-67a89bd0532a" containerName="registry-server" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.193976 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="e661f166-1e0a-481d-86d9-6d062411d9db" containerName="registry-server" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.194263 4869 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.194283 4869 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 18 05:50:32 crc kubenswrapper[4869]: E0218 05:50:32.194382 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.194390 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 18 05:50:32 crc kubenswrapper[4869]: E0218 05:50:32.194401 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.194406 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 18 05:50:32 crc kubenswrapper[4869]: E0218 05:50:32.194415 4869 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.194421 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 18 05:50:32 crc kubenswrapper[4869]: E0218 05:50:32.194428 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.194433 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 18 05:50:32 crc kubenswrapper[4869]: E0218 05:50:32.194440 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.194445 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 18 05:50:32 crc kubenswrapper[4869]: E0218 05:50:32.194456 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.194461 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 18 05:50:32 crc kubenswrapper[4869]: E0218 05:50:32.194471 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.194477 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="setup" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.194550 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.194560 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.194566 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.194574 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.194582 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.194588 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.195267 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.195814 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://887726f89bfa0de3b913d9c306c514ff0169d4c6029c5a59fbf6d9f6ed9d22ba" gracePeriod=15 Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.195927 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://379c3d476e040da1ccc05cff51bae550969be8fd3ea536ebb7b2f19aab5aae7d" gracePeriod=15 Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.195961 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://75b90d1a9774d318ea853311692c2dde1863497547c151b7e7826531868e619b" gracePeriod=15 Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.195991 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://29c88479e93208a5a7ce906e7547c7978ea44ed40294b49082c24c97c7c719dd" gracePeriod=15 Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.196019 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://325ec31cd94a8e94900dbf516361ff49318d8d866df876657162a22ac4efefb2" gracePeriod=15 Feb 18 05:50:32 crc 
kubenswrapper[4869]: I0218 05:50:32.200592 4869 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.238283 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.282563 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.282633 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.282682 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.282720 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.282748 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.282809 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.282844 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.282942 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.387127 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.387471 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.387496 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.387307 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.387577 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.387588 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") 
" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.387527 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.387626 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.387665 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.387686 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.387712 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:32 crc 
kubenswrapper[4869]: I0218 05:50:32.387735 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.387815 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.387837 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.387772 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.387850 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:32 crc kubenswrapper[4869]: I0218 05:50:32.538338 4869 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 05:50:32 crc kubenswrapper[4869]: E0218 05:50:32.565502 4869 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.50:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895414e3355ffa3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-18 05:50:32.564686755 +0000 UTC m=+129.733774987,LastTimestamp:2026-02-18 05:50:32.564686755 +0000 UTC m=+129.733774987,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 18 05:50:33 crc kubenswrapper[4869]: I0218 05:50:33.090732 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 18 05:50:33 crc kubenswrapper[4869]: I0218 05:50:33.091894 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 18 05:50:33 crc kubenswrapper[4869]: I0218 05:50:33.093542 4869 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="379c3d476e040da1ccc05cff51bae550969be8fd3ea536ebb7b2f19aab5aae7d" exitCode=0 Feb 18 05:50:33 crc kubenswrapper[4869]: I0218 05:50:33.093573 4869 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="75b90d1a9774d318ea853311692c2dde1863497547c151b7e7826531868e619b" exitCode=0 Feb 18 05:50:33 crc kubenswrapper[4869]: I0218 05:50:33.093589 4869 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="29c88479e93208a5a7ce906e7547c7978ea44ed40294b49082c24c97c7c719dd" exitCode=0 Feb 18 05:50:33 crc kubenswrapper[4869]: I0218 05:50:33.093615 4869 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="325ec31cd94a8e94900dbf516361ff49318d8d866df876657162a22ac4efefb2" exitCode=2 Feb 18 05:50:33 crc kubenswrapper[4869]: I0218 05:50:33.093637 4869 scope.go:117] "RemoveContainer" containerID="98db281657093f71f44016c866f6e5bae319d20d6457df6f0f5adec6f4af40f2" Feb 18 05:50:33 crc kubenswrapper[4869]: I0218 05:50:33.096046 4869 generic.go:334] "Generic (PLEG): container finished" podID="d3dab06c-6d1a-4665-a4a6-7549071c8b13" containerID="d572ed22414e1f6700dffa5a2a21451daf5ef58a6fb4eda887b77db4e14aa617" exitCode=0 Feb 18 05:50:33 crc kubenswrapper[4869]: I0218 05:50:33.096082 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d3dab06c-6d1a-4665-a4a6-7549071c8b13","Type":"ContainerDied","Data":"d572ed22414e1f6700dffa5a2a21451daf5ef58a6fb4eda887b77db4e14aa617"} Feb 18 05:50:33 crc kubenswrapper[4869]: I0218 05:50:33.096930 4869 status_manager.go:851] "Failed to get status for pod" podUID="d3dab06c-6d1a-4665-a4a6-7549071c8b13" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:33 crc 
kubenswrapper[4869]: I0218 05:50:33.097257 4869 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:33 crc kubenswrapper[4869]: I0218 05:50:33.098142 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"5eba024d87fdb53e76633eaafa94471662563dee017954fe1d8494df8f4d49e2"} Feb 18 05:50:33 crc kubenswrapper[4869]: I0218 05:50:33.098175 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"05d483247235a30ad11f9effe236c65e0c2a6c10d4e13c857ed03353b27747d9"} Feb 18 05:50:33 crc kubenswrapper[4869]: I0218 05:50:33.098631 4869 status_manager.go:851] "Failed to get status for pod" podUID="d3dab06c-6d1a-4665-a4a6-7549071c8b13" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:33 crc kubenswrapper[4869]: I0218 05:50:33.099123 4869 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:33 crc kubenswrapper[4869]: I0218 05:50:33.471963 4869 status_manager.go:851] "Failed to get status for pod" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:33 crc kubenswrapper[4869]: I0218 05:50:33.472466 4869 status_manager.go:851] "Failed to get status for pod" podUID="d3dab06c-6d1a-4665-a4a6-7549071c8b13" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:33 crc kubenswrapper[4869]: E0218 05:50:33.491532 4869 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.50:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" volumeName="registry-storage" Feb 18 05:50:34 crc kubenswrapper[4869]: I0218 05:50:34.108415 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 18 05:50:34 crc kubenswrapper[4869]: I0218 05:50:34.364893 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 18 05:50:34 crc kubenswrapper[4869]: I0218 05:50:34.365421 4869 status_manager.go:851] "Failed to get status for pod" podUID="d3dab06c-6d1a-4665-a4a6-7549071c8b13" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:34 crc kubenswrapper[4869]: I0218 05:50:34.365638 4869 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:34 crc kubenswrapper[4869]: I0218 05:50:34.418646 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d3dab06c-6d1a-4665-a4a6-7549071c8b13-var-lock\") pod \"d3dab06c-6d1a-4665-a4a6-7549071c8b13\" (UID: \"d3dab06c-6d1a-4665-a4a6-7549071c8b13\") " Feb 18 05:50:34 crc kubenswrapper[4869]: I0218 05:50:34.418702 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3dab06c-6d1a-4665-a4a6-7549071c8b13-kubelet-dir\") pod \"d3dab06c-6d1a-4665-a4a6-7549071c8b13\" (UID: \"d3dab06c-6d1a-4665-a4a6-7549071c8b13\") " Feb 18 05:50:34 crc kubenswrapper[4869]: I0218 05:50:34.418730 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3dab06c-6d1a-4665-a4a6-7549071c8b13-var-lock" (OuterVolumeSpecName: "var-lock") pod "d3dab06c-6d1a-4665-a4a6-7549071c8b13" (UID: "d3dab06c-6d1a-4665-a4a6-7549071c8b13"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 05:50:34 crc kubenswrapper[4869]: I0218 05:50:34.418763 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3dab06c-6d1a-4665-a4a6-7549071c8b13-kube-api-access\") pod \"d3dab06c-6d1a-4665-a4a6-7549071c8b13\" (UID: \"d3dab06c-6d1a-4665-a4a6-7549071c8b13\") "
Feb 18 05:50:34 crc kubenswrapper[4869]: I0218 05:50:34.418933 4869 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d3dab06c-6d1a-4665-a4a6-7549071c8b13-var-lock\") on node \"crc\" DevicePath \"\""
Feb 18 05:50:34 crc kubenswrapper[4869]: I0218 05:50:34.418952 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3dab06c-6d1a-4665-a4a6-7549071c8b13-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d3dab06c-6d1a-4665-a4a6-7549071c8b13" (UID: "d3dab06c-6d1a-4665-a4a6-7549071c8b13"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 05:50:34 crc kubenswrapper[4869]: I0218 05:50:34.438058 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3dab06c-6d1a-4665-a4a6-7549071c8b13-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d3dab06c-6d1a-4665-a4a6-7549071c8b13" (UID: "d3dab06c-6d1a-4665-a4a6-7549071c8b13"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 05:50:34 crc kubenswrapper[4869]: I0218 05:50:34.520439 4869 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3dab06c-6d1a-4665-a4a6-7549071c8b13-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 18 05:50:34 crc kubenswrapper[4869]: I0218 05:50:34.520477 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3dab06c-6d1a-4665-a4a6-7549071c8b13-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 18 05:50:34 crc kubenswrapper[4869]: I0218 05:50:34.546992 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 18 05:50:34 crc kubenswrapper[4869]: I0218 05:50:34.548060 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 05:50:34 crc kubenswrapper[4869]: I0218 05:50:34.548628 4869 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Feb 18 05:50:34 crc kubenswrapper[4869]: I0218 05:50:34.548995 4869 status_manager.go:851] "Failed to get status for pod" podUID="d3dab06c-6d1a-4665-a4a6-7549071c8b13" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Feb 18 05:50:34 crc kubenswrapper[4869]: I0218 05:50:34.549255 4869 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Feb 18 05:50:34 crc kubenswrapper[4869]: I0218 05:50:34.621232 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 18 05:50:34 crc kubenswrapper[4869]: I0218 05:50:34.621290 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 18 05:50:34 crc kubenswrapper[4869]: I0218 05:50:34.621302 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 05:50:34 crc kubenswrapper[4869]: I0218 05:50:34.621332 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 18 05:50:34 crc kubenswrapper[4869]: I0218 05:50:34.621403 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 05:50:34 crc kubenswrapper[4869]: I0218 05:50:34.621445 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 05:50:34 crc kubenswrapper[4869]: I0218 05:50:34.621482 4869 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Feb 18 05:50:34 crc kubenswrapper[4869]: I0218 05:50:34.621498 4869 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 18 05:50:34 crc kubenswrapper[4869]: I0218 05:50:34.722506 4869 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.115296 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d3dab06c-6d1a-4665-a4a6-7549071c8b13","Type":"ContainerDied","Data":"afc669edd33f1c7a94e8b437dbed535ecbbad071bb7b6fc683027ec03505c5fe"}
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.115333 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afc669edd33f1c7a94e8b437dbed535ecbbad071bb7b6fc683027ec03505c5fe"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.115335 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.117930 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.130360 4869 status_manager.go:851] "Failed to get status for pod" podUID="d3dab06c-6d1a-4665-a4a6-7549071c8b13" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.130551 4869 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.130713 4869 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.132001 4869 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="887726f89bfa0de3b913d9c306c514ff0169d4c6029c5a59fbf6d9f6ed9d22ba" exitCode=0
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.132066 4869 scope.go:117] "RemoveContainer" containerID="379c3d476e040da1ccc05cff51bae550969be8fd3ea536ebb7b2f19aab5aae7d"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.132168 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.147633 4869 status_manager.go:851] "Failed to get status for pod" podUID="d3dab06c-6d1a-4665-a4a6-7549071c8b13" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.148413 4869 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.148667 4869 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.154369 4869 scope.go:117] "RemoveContainer" containerID="75b90d1a9774d318ea853311692c2dde1863497547c151b7e7826531868e619b"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.167654 4869 scope.go:117] "RemoveContainer" containerID="29c88479e93208a5a7ce906e7547c7978ea44ed40294b49082c24c97c7c719dd"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.188487 4869 scope.go:117] "RemoveContainer" containerID="325ec31cd94a8e94900dbf516361ff49318d8d866df876657162a22ac4efefb2"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.200978 4869 scope.go:117] "RemoveContainer" containerID="887726f89bfa0de3b913d9c306c514ff0169d4c6029c5a59fbf6d9f6ed9d22ba"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.216755 4869 scope.go:117] "RemoveContainer" containerID="db2fa5a8e1ca17324ec5e23c6dc900b1f40e4da54f3cbedf2ae80f6e89c47301"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.233747 4869 scope.go:117] "RemoveContainer" containerID="379c3d476e040da1ccc05cff51bae550969be8fd3ea536ebb7b2f19aab5aae7d"
Feb 18 05:50:35 crc kubenswrapper[4869]: E0218 05:50:35.234188 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"379c3d476e040da1ccc05cff51bae550969be8fd3ea536ebb7b2f19aab5aae7d\": container with ID starting with 379c3d476e040da1ccc05cff51bae550969be8fd3ea536ebb7b2f19aab5aae7d not found: ID does not exist" containerID="379c3d476e040da1ccc05cff51bae550969be8fd3ea536ebb7b2f19aab5aae7d"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.234235 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"379c3d476e040da1ccc05cff51bae550969be8fd3ea536ebb7b2f19aab5aae7d"} err="failed to get container status \"379c3d476e040da1ccc05cff51bae550969be8fd3ea536ebb7b2f19aab5aae7d\": rpc error: code = NotFound desc = could not find container \"379c3d476e040da1ccc05cff51bae550969be8fd3ea536ebb7b2f19aab5aae7d\": container with ID starting with 379c3d476e040da1ccc05cff51bae550969be8fd3ea536ebb7b2f19aab5aae7d not found: ID does not exist"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.234263 4869 scope.go:117] "RemoveContainer" containerID="75b90d1a9774d318ea853311692c2dde1863497547c151b7e7826531868e619b"
Feb 18 05:50:35 crc kubenswrapper[4869]: E0218 05:50:35.234544 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75b90d1a9774d318ea853311692c2dde1863497547c151b7e7826531868e619b\": container with ID starting with 75b90d1a9774d318ea853311692c2dde1863497547c151b7e7826531868e619b not found: ID does not exist" containerID="75b90d1a9774d318ea853311692c2dde1863497547c151b7e7826531868e619b"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.234583 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75b90d1a9774d318ea853311692c2dde1863497547c151b7e7826531868e619b"} err="failed to get container status \"75b90d1a9774d318ea853311692c2dde1863497547c151b7e7826531868e619b\": rpc error: code = NotFound desc = could not find container \"75b90d1a9774d318ea853311692c2dde1863497547c151b7e7826531868e619b\": container with ID starting with 75b90d1a9774d318ea853311692c2dde1863497547c151b7e7826531868e619b not found: ID does not exist"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.234612 4869 scope.go:117] "RemoveContainer" containerID="29c88479e93208a5a7ce906e7547c7978ea44ed40294b49082c24c97c7c719dd"
Feb 18 05:50:35 crc kubenswrapper[4869]: E0218 05:50:35.234878 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29c88479e93208a5a7ce906e7547c7978ea44ed40294b49082c24c97c7c719dd\": container with ID starting with 29c88479e93208a5a7ce906e7547c7978ea44ed40294b49082c24c97c7c719dd not found: ID does not exist" containerID="29c88479e93208a5a7ce906e7547c7978ea44ed40294b49082c24c97c7c719dd"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.234899 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29c88479e93208a5a7ce906e7547c7978ea44ed40294b49082c24c97c7c719dd"} err="failed to get container status \"29c88479e93208a5a7ce906e7547c7978ea44ed40294b49082c24c97c7c719dd\": rpc error: code = NotFound desc = could not find container \"29c88479e93208a5a7ce906e7547c7978ea44ed40294b49082c24c97c7c719dd\": container with ID starting with 29c88479e93208a5a7ce906e7547c7978ea44ed40294b49082c24c97c7c719dd not found: ID does not exist"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.234914 4869 scope.go:117] "RemoveContainer" containerID="325ec31cd94a8e94900dbf516361ff49318d8d866df876657162a22ac4efefb2"
Feb 18 05:50:35 crc kubenswrapper[4869]: E0218 05:50:35.235142 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"325ec31cd94a8e94900dbf516361ff49318d8d866df876657162a22ac4efefb2\": container with ID starting with 325ec31cd94a8e94900dbf516361ff49318d8d866df876657162a22ac4efefb2 not found: ID does not exist" containerID="325ec31cd94a8e94900dbf516361ff49318d8d866df876657162a22ac4efefb2"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.235156 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"325ec31cd94a8e94900dbf516361ff49318d8d866df876657162a22ac4efefb2"} err="failed to get container status \"325ec31cd94a8e94900dbf516361ff49318d8d866df876657162a22ac4efefb2\": rpc error: code = NotFound desc = could not find container \"325ec31cd94a8e94900dbf516361ff49318d8d866df876657162a22ac4efefb2\": container with ID starting with 325ec31cd94a8e94900dbf516361ff49318d8d866df876657162a22ac4efefb2 not found: ID does not exist"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.235169 4869 scope.go:117] "RemoveContainer" containerID="887726f89bfa0de3b913d9c306c514ff0169d4c6029c5a59fbf6d9f6ed9d22ba"
Feb 18 05:50:35 crc kubenswrapper[4869]: E0218 05:50:35.235378 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"887726f89bfa0de3b913d9c306c514ff0169d4c6029c5a59fbf6d9f6ed9d22ba\": container with ID starting with 887726f89bfa0de3b913d9c306c514ff0169d4c6029c5a59fbf6d9f6ed9d22ba not found: ID does not exist" containerID="887726f89bfa0de3b913d9c306c514ff0169d4c6029c5a59fbf6d9f6ed9d22ba"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.235396 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"887726f89bfa0de3b913d9c306c514ff0169d4c6029c5a59fbf6d9f6ed9d22ba"} err="failed to get container status \"887726f89bfa0de3b913d9c306c514ff0169d4c6029c5a59fbf6d9f6ed9d22ba\": rpc error: code = NotFound desc = could not find container \"887726f89bfa0de3b913d9c306c514ff0169d4c6029c5a59fbf6d9f6ed9d22ba\": container with ID starting with 887726f89bfa0de3b913d9c306c514ff0169d4c6029c5a59fbf6d9f6ed9d22ba not found: ID does not exist"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.235410 4869 scope.go:117] "RemoveContainer" containerID="db2fa5a8e1ca17324ec5e23c6dc900b1f40e4da54f3cbedf2ae80f6e89c47301"
Feb 18 05:50:35 crc kubenswrapper[4869]: E0218 05:50:35.235661 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db2fa5a8e1ca17324ec5e23c6dc900b1f40e4da54f3cbedf2ae80f6e89c47301\": container with ID starting with db2fa5a8e1ca17324ec5e23c6dc900b1f40e4da54f3cbedf2ae80f6e89c47301 not found: ID does not exist" containerID="db2fa5a8e1ca17324ec5e23c6dc900b1f40e4da54f3cbedf2ae80f6e89c47301"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.235680 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db2fa5a8e1ca17324ec5e23c6dc900b1f40e4da54f3cbedf2ae80f6e89c47301"} err="failed to get container status \"db2fa5a8e1ca17324ec5e23c6dc900b1f40e4da54f3cbedf2ae80f6e89c47301\": rpc error: code = NotFound desc = could not find container \"db2fa5a8e1ca17324ec5e23c6dc900b1f40e4da54f3cbedf2ae80f6e89c47301\": container with ID starting with db2fa5a8e1ca17324ec5e23c6dc900b1f40e4da54f3cbedf2ae80f6e89c47301 not found: ID does not exist"
Feb 18 05:50:35 crc kubenswrapper[4869]: I0218 05:50:35.478235 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.045743 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" podUID="d72ef0c5-fb30-4d98-9237-a992acf49959" containerName="oauth-openshift" containerID="cri-o://d2aa7a5a5f943b63ee723f45044fe5ed6ee0976a70e3ee41341a0a72f512fae6" gracePeriod=15
Feb 18 05:50:36 crc kubenswrapper[4869]: E0218 05:50:36.134768 4869 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.50:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895414e3355ffa3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-18 05:50:32.564686755 +0000 UTC m=+129.733774987,LastTimestamp:2026-02-18 05:50:32.564686755 +0000 UTC m=+129.733774987,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.386653 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tthlh"
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.387333 4869 status_manager.go:851] "Failed to get status for pod" podUID="d72ef0c5-fb30-4d98-9237-a992acf49959" pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-tthlh\": dial tcp 38.102.83.50:6443: connect: connection refused"
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.387938 4869 status_manager.go:851] "Failed to get status for pod" podUID="d3dab06c-6d1a-4665-a4a6-7549071c8b13" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.388520 4869 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.440967 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-ocp-branding-template\") pod \"d72ef0c5-fb30-4d98-9237-a992acf49959\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") "
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.441066 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdgn7\" (UniqueName: \"kubernetes.io/projected/d72ef0c5-fb30-4d98-9237-a992acf49959-kube-api-access-qdgn7\") pod \"d72ef0c5-fb30-4d98-9237-a992acf49959\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") "
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.441102 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-router-certs\") pod \"d72ef0c5-fb30-4d98-9237-a992acf49959\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") "
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.441126 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-user-idp-0-file-data\") pod \"d72ef0c5-fb30-4d98-9237-a992acf49959\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") "
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.441158 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-user-template-error\") pod \"d72ef0c5-fb30-4d98-9237-a992acf49959\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") "
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.441197 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-cliconfig\") pod \"d72ef0c5-fb30-4d98-9237-a992acf49959\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") "
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.441229 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-service-ca\") pod \"d72ef0c5-fb30-4d98-9237-a992acf49959\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") "
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.441258 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-serving-cert\") pod \"d72ef0c5-fb30-4d98-9237-a992acf49959\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") "
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.441274 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d72ef0c5-fb30-4d98-9237-a992acf49959-audit-policies\") pod \"d72ef0c5-fb30-4d98-9237-a992acf49959\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") "
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.441298 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d72ef0c5-fb30-4d98-9237-a992acf49959-audit-dir\") pod \"d72ef0c5-fb30-4d98-9237-a992acf49959\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") "
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.441324 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-session\") pod \"d72ef0c5-fb30-4d98-9237-a992acf49959\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") "
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.441355 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-user-template-login\") pod \"d72ef0c5-fb30-4d98-9237-a992acf49959\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") "
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.441377 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-trusted-ca-bundle\") pod \"d72ef0c5-fb30-4d98-9237-a992acf49959\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") "
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.441401 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-user-template-provider-selection\") pod \"d72ef0c5-fb30-4d98-9237-a992acf49959\" (UID: \"d72ef0c5-fb30-4d98-9237-a992acf49959\") "
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.441463 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d72ef0c5-fb30-4d98-9237-a992acf49959-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "d72ef0c5-fb30-4d98-9237-a992acf49959" (UID: "d72ef0c5-fb30-4d98-9237-a992acf49959"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.441583 4869 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d72ef0c5-fb30-4d98-9237-a992acf49959-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.442069 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "d72ef0c5-fb30-4d98-9237-a992acf49959" (UID: "d72ef0c5-fb30-4d98-9237-a992acf49959"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.442223 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "d72ef0c5-fb30-4d98-9237-a992acf49959" (UID: "d72ef0c5-fb30-4d98-9237-a992acf49959"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.442441 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d72ef0c5-fb30-4d98-9237-a992acf49959-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "d72ef0c5-fb30-4d98-9237-a992acf49959" (UID: "d72ef0c5-fb30-4d98-9237-a992acf49959"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.442689 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "d72ef0c5-fb30-4d98-9237-a992acf49959" (UID: "d72ef0c5-fb30-4d98-9237-a992acf49959"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.447936 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "d72ef0c5-fb30-4d98-9237-a992acf49959" (UID: "d72ef0c5-fb30-4d98-9237-a992acf49959"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.448237 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "d72ef0c5-fb30-4d98-9237-a992acf49959" (UID: "d72ef0c5-fb30-4d98-9237-a992acf49959"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.448369 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "d72ef0c5-fb30-4d98-9237-a992acf49959" (UID: "d72ef0c5-fb30-4d98-9237-a992acf49959"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.448870 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "d72ef0c5-fb30-4d98-9237-a992acf49959" (UID: "d72ef0c5-fb30-4d98-9237-a992acf49959"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.448919 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d72ef0c5-fb30-4d98-9237-a992acf49959-kube-api-access-qdgn7" (OuterVolumeSpecName: "kube-api-access-qdgn7") pod "d72ef0c5-fb30-4d98-9237-a992acf49959" (UID: "d72ef0c5-fb30-4d98-9237-a992acf49959"). InnerVolumeSpecName "kube-api-access-qdgn7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.449104 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "d72ef0c5-fb30-4d98-9237-a992acf49959" (UID: "d72ef0c5-fb30-4d98-9237-a992acf49959"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.449198 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "d72ef0c5-fb30-4d98-9237-a992acf49959" (UID: "d72ef0c5-fb30-4d98-9237-a992acf49959"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.449454 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "d72ef0c5-fb30-4d98-9237-a992acf49959" (UID: "d72ef0c5-fb30-4d98-9237-a992acf49959"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.451921 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "d72ef0c5-fb30-4d98-9237-a992acf49959" (UID: "d72ef0c5-fb30-4d98-9237-a992acf49959"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.542498 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.542539 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.542549 4869 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d72ef0c5-fb30-4d98-9237-a992acf49959-audit-policies\") on node \"crc\" DevicePath \"\""
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.542557 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.542570 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.542579 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.542587 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.542596 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.542605 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdgn7\" (UniqueName: \"kubernetes.io/projected/d72ef0c5-fb30-4d98-9237-a992acf49959-kube-api-access-qdgn7\") on node \"crc\" DevicePath \"\""
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.542613 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.542622 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.542630 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Feb 18 05:50:36 crc kubenswrapper[4869]: I0218 05:50:36.542638 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d72ef0c5-fb30-4d98-9237-a992acf49959-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Feb 18 05:50:37 crc kubenswrapper[4869]: I0218 05:50:37.146360 4869 generic.go:334] "Generic (PLEG): container finished" podID="d72ef0c5-fb30-4d98-9237-a992acf49959" containerID="d2aa7a5a5f943b63ee723f45044fe5ed6ee0976a70e3ee41341a0a72f512fae6" exitCode=0
Feb 18 05:50:37 crc kubenswrapper[4869]: I0218 05:50:37.146396 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" event={"ID":"d72ef0c5-fb30-4d98-9237-a992acf49959","Type":"ContainerDied","Data":"d2aa7a5a5f943b63ee723f45044fe5ed6ee0976a70e3ee41341a0a72f512fae6"}
Feb 18 05:50:37 crc kubenswrapper[4869]: I0218 05:50:37.146426 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" event={"ID":"d72ef0c5-fb30-4d98-9237-a992acf49959","Type":"ContainerDied","Data":"381edd3bf077c5e6352d06f632c3e6af54b1281337157000ed4a2d5506f4a5ad"}
Feb 18 05:50:37 crc kubenswrapper[4869]: I0218 05:50:37.146444 4869 scope.go:117] "RemoveContainer" containerID="d2aa7a5a5f943b63ee723f45044fe5ed6ee0976a70e3ee41341a0a72f512fae6"
Feb 18 05:50:37 crc kubenswrapper[4869]: I0218 05:50:37.146535 4869 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" Feb 18 05:50:37 crc kubenswrapper[4869]: I0218 05:50:37.147153 4869 status_manager.go:851] "Failed to get status for pod" podUID="d72ef0c5-fb30-4d98-9237-a992acf49959" pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-tthlh\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:37 crc kubenswrapper[4869]: I0218 05:50:37.147443 4869 status_manager.go:851] "Failed to get status for pod" podUID="d3dab06c-6d1a-4665-a4a6-7549071c8b13" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:37 crc kubenswrapper[4869]: I0218 05:50:37.147791 4869 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:37 crc kubenswrapper[4869]: I0218 05:50:37.176571 4869 status_manager.go:851] "Failed to get status for pod" podUID="d3dab06c-6d1a-4665-a4a6-7549071c8b13" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:37 crc kubenswrapper[4869]: I0218 05:50:37.177477 4869 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:37 crc kubenswrapper[4869]: I0218 05:50:37.178119 4869 status_manager.go:851] "Failed to get status for pod" podUID="d72ef0c5-fb30-4d98-9237-a992acf49959" pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-tthlh\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:37 crc kubenswrapper[4869]: I0218 05:50:37.179994 4869 scope.go:117] "RemoveContainer" containerID="d2aa7a5a5f943b63ee723f45044fe5ed6ee0976a70e3ee41341a0a72f512fae6" Feb 18 05:50:37 crc kubenswrapper[4869]: E0218 05:50:37.180544 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2aa7a5a5f943b63ee723f45044fe5ed6ee0976a70e3ee41341a0a72f512fae6\": container with ID starting with d2aa7a5a5f943b63ee723f45044fe5ed6ee0976a70e3ee41341a0a72f512fae6 not found: ID does not exist" containerID="d2aa7a5a5f943b63ee723f45044fe5ed6ee0976a70e3ee41341a0a72f512fae6" Feb 18 05:50:37 crc kubenswrapper[4869]: I0218 05:50:37.180571 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2aa7a5a5f943b63ee723f45044fe5ed6ee0976a70e3ee41341a0a72f512fae6"} err="failed to get container status \"d2aa7a5a5f943b63ee723f45044fe5ed6ee0976a70e3ee41341a0a72f512fae6\": rpc error: code = NotFound desc = could not find container \"d2aa7a5a5f943b63ee723f45044fe5ed6ee0976a70e3ee41341a0a72f512fae6\": container with ID starting with d2aa7a5a5f943b63ee723f45044fe5ed6ee0976a70e3ee41341a0a72f512fae6 not found: ID does not exist" Feb 18 05:50:41 crc kubenswrapper[4869]: E0218 05:50:41.182627 4869 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:41 crc kubenswrapper[4869]: E0218 05:50:41.183704 4869 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:41 crc kubenswrapper[4869]: E0218 05:50:41.184081 4869 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:41 crc kubenswrapper[4869]: E0218 05:50:41.184412 4869 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:41 crc kubenswrapper[4869]: E0218 05:50:41.184957 4869 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:41 crc kubenswrapper[4869]: I0218 05:50:41.184988 4869 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 18 05:50:41 crc kubenswrapper[4869]: E0218 05:50:41.185215 4869 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="200ms" Feb 18 05:50:41 crc kubenswrapper[4869]: E0218 05:50:41.386128 4869 controller.go:145] "Failed to ensure lease exists, will 
retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="400ms" Feb 18 05:50:41 crc kubenswrapper[4869]: E0218 05:50:41.447573 4869 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T05:50:41Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T05:50:41Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T05:50:41Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T05:50:41Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:41 crc kubenswrapper[4869]: E0218 05:50:41.448134 4869 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:41 crc kubenswrapper[4869]: E0218 05:50:41.448589 4869 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get 
\"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:41 crc kubenswrapper[4869]: E0218 05:50:41.449099 4869 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:41 crc kubenswrapper[4869]: E0218 05:50:41.449601 4869 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:41 crc kubenswrapper[4869]: E0218 05:50:41.449647 4869 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 18 05:50:41 crc kubenswrapper[4869]: E0218 05:50:41.787316 4869 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="800ms" Feb 18 05:50:42 crc kubenswrapper[4869]: E0218 05:50:42.588125 4869 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="1.6s" Feb 18 05:50:43 crc kubenswrapper[4869]: I0218 05:50:43.476261 4869 status_manager.go:851] "Failed to get status for pod" podUID="d72ef0c5-fb30-4d98-9237-a992acf49959" pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-tthlh\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:43 crc 
kubenswrapper[4869]: I0218 05:50:43.477244 4869 status_manager.go:851] "Failed to get status for pod" podUID="d3dab06c-6d1a-4665-a4a6-7549071c8b13" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:43 crc kubenswrapper[4869]: I0218 05:50:43.477660 4869 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:44 crc kubenswrapper[4869]: E0218 05:50:44.189271 4869 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="3.2s" Feb 18 05:50:46 crc kubenswrapper[4869]: E0218 05:50:46.136315 4869 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.50:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895414e3355ffa3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-18 05:50:32.564686755 +0000 UTC m=+129.733774987,LastTimestamp:2026-02-18 05:50:32.564686755 +0000 UTC m=+129.733774987,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 18 05:50:46 crc kubenswrapper[4869]: I0218 05:50:46.469189 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:46 crc kubenswrapper[4869]: I0218 05:50:46.471036 4869 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:46 crc kubenswrapper[4869]: I0218 05:50:46.471631 4869 status_manager.go:851] "Failed to get status for pod" podUID="d72ef0c5-fb30-4d98-9237-a992acf49959" pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-tthlh\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:46 crc kubenswrapper[4869]: I0218 05:50:46.472138 4869 status_manager.go:851] "Failed to get status for pod" podUID="d3dab06c-6d1a-4665-a4a6-7549071c8b13" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:46 crc kubenswrapper[4869]: I0218 05:50:46.492082 4869 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7e7e81a8-99d2-4752-8198-fbf3f6bfa860" Feb 18 05:50:46 crc kubenswrapper[4869]: I0218 
05:50:46.492120 4869 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7e7e81a8-99d2-4752-8198-fbf3f6bfa860" Feb 18 05:50:46 crc kubenswrapper[4869]: E0218 05:50:46.492658 4869 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:46 crc kubenswrapper[4869]: I0218 05:50:46.493104 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:47 crc kubenswrapper[4869]: I0218 05:50:47.206917 4869 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="030f4ab2c9b7f6c5f98d3cadcd3fcf3c569232baf439d47030d2cc5af7936cc8" exitCode=0 Feb 18 05:50:47 crc kubenswrapper[4869]: I0218 05:50:47.206991 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"030f4ab2c9b7f6c5f98d3cadcd3fcf3c569232baf439d47030d2cc5af7936cc8"} Feb 18 05:50:47 crc kubenswrapper[4869]: I0218 05:50:47.207516 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2fd92ebb5343eafc9fbef215163d056993d48898afcb00ee09e36b1872b2c0c2"} Feb 18 05:50:47 crc kubenswrapper[4869]: I0218 05:50:47.207964 4869 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7e7e81a8-99d2-4752-8198-fbf3f6bfa860" Feb 18 05:50:47 crc kubenswrapper[4869]: I0218 05:50:47.207990 4869 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7e7e81a8-99d2-4752-8198-fbf3f6bfa860" Feb 18 
05:50:47 crc kubenswrapper[4869]: E0218 05:50:47.208974 4869 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:47 crc kubenswrapper[4869]: I0218 05:50:47.209046 4869 status_manager.go:851] "Failed to get status for pod" podUID="d3dab06c-6d1a-4665-a4a6-7549071c8b13" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:47 crc kubenswrapper[4869]: I0218 05:50:47.209521 4869 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:47 crc kubenswrapper[4869]: I0218 05:50:47.209728 4869 status_manager.go:851] "Failed to get status for pod" podUID="d72ef0c5-fb30-4d98-9237-a992acf49959" pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-tthlh\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:47 crc kubenswrapper[4869]: I0218 05:50:47.211382 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 18 05:50:47 crc kubenswrapper[4869]: I0218 05:50:47.211423 4869 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" 
containerID="91ff9dbe190f88939f4cb4bb1b2b925b0585a8d1de7adc355da2eeecd3f1eb33" exitCode=1 Feb 18 05:50:47 crc kubenswrapper[4869]: I0218 05:50:47.211442 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"91ff9dbe190f88939f4cb4bb1b2b925b0585a8d1de7adc355da2eeecd3f1eb33"} Feb 18 05:50:47 crc kubenswrapper[4869]: I0218 05:50:47.211718 4869 scope.go:117] "RemoveContainer" containerID="91ff9dbe190f88939f4cb4bb1b2b925b0585a8d1de7adc355da2eeecd3f1eb33" Feb 18 05:50:47 crc kubenswrapper[4869]: I0218 05:50:47.212232 4869 status_manager.go:851] "Failed to get status for pod" podUID="d72ef0c5-fb30-4d98-9237-a992acf49959" pod="openshift-authentication/oauth-openshift-558db77b4-tthlh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-tthlh\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:47 crc kubenswrapper[4869]: I0218 05:50:47.212397 4869 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:47 crc kubenswrapper[4869]: I0218 05:50:47.212542 4869 status_manager.go:851] "Failed to get status for pod" podUID="d3dab06c-6d1a-4665-a4a6-7549071c8b13" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:47 crc kubenswrapper[4869]: I0218 05:50:47.212684 4869 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Feb 18 05:50:47 crc kubenswrapper[4869]: E0218 05:50:47.390853 4869 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="6.4s" Feb 18 05:50:48 crc kubenswrapper[4869]: I0218 05:50:48.218874 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"76d121728556f74545f4c758b53703b92e467028b9d727df81eeb306f3183992"} Feb 18 05:50:48 crc kubenswrapper[4869]: I0218 05:50:48.219841 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4df5b48781facdb4eeb40d21922728d4a05999271ab803415ca5c26b6cd86fb4"} Feb 18 05:50:48 crc kubenswrapper[4869]: I0218 05:50:48.219859 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c96b559d57a979efbea52b49dea51b7bd9473e5b2784688525d9183a94e9e22b"} Feb 18 05:50:48 crc kubenswrapper[4869]: I0218 05:50:48.219869 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"397cbd55fbcc25825d7d16e6bab32d6947891206797a357e315539ef01c7efa8"} Feb 18 05:50:48 crc kubenswrapper[4869]: I0218 05:50:48.221347 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 18 05:50:48 crc kubenswrapper[4869]: I0218 05:50:48.221390 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a6df2629a5b27d3fb9576432b2dfd9384f1e329dbd7d48b459e5bac1f21628b8"} Feb 18 05:50:49 crc kubenswrapper[4869]: I0218 05:50:49.228264 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e6810628aa760f187d460f261061ddb4920c24278430c794b35aa3077802a9c7"} Feb 18 05:50:49 crc kubenswrapper[4869]: I0218 05:50:49.228442 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:49 crc kubenswrapper[4869]: I0218 05:50:49.228502 4869 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7e7e81a8-99d2-4752-8198-fbf3f6bfa860" Feb 18 05:50:49 crc kubenswrapper[4869]: I0218 05:50:49.228527 4869 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7e7e81a8-99d2-4752-8198-fbf3f6bfa860" Feb 18 05:50:51 crc kubenswrapper[4869]: I0218 05:50:51.493787 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:51 crc kubenswrapper[4869]: I0218 05:50:51.494108 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:51 crc kubenswrapper[4869]: I0218 05:50:51.498207 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:52 crc kubenswrapper[4869]: I0218 
05:50:52.674190 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 05:50:53 crc kubenswrapper[4869]: I0218 05:50:53.891091 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 05:50:53 crc kubenswrapper[4869]: I0218 05:50:53.891896 4869 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 18 05:50:53 crc kubenswrapper[4869]: I0218 05:50:53.892021 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 18 05:50:54 crc kubenswrapper[4869]: I0218 05:50:54.236732 4869 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:54 crc kubenswrapper[4869]: I0218 05:50:54.252528 4869 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7e7e81a8-99d2-4752-8198-fbf3f6bfa860" Feb 18 05:50:54 crc kubenswrapper[4869]: I0218 05:50:54.252559 4869 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7e7e81a8-99d2-4752-8198-fbf3f6bfa860" Feb 18 05:50:54 crc kubenswrapper[4869]: I0218 05:50:54.256441 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:54 crc kubenswrapper[4869]: I0218 05:50:54.258766 4869 
status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ae2d946c-12e0-4c80-b8ad-5cce2cd2e686"
Feb 18 05:50:55 crc kubenswrapper[4869]: I0218 05:50:55.256162 4869 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7e7e81a8-99d2-4752-8198-fbf3f6bfa860"
Feb 18 05:50:55 crc kubenswrapper[4869]: I0218 05:50:55.256187 4869 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7e7e81a8-99d2-4752-8198-fbf3f6bfa860"
Feb 18 05:51:03 crc kubenswrapper[4869]: I0218 05:51:03.486266 4869 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ae2d946c-12e0-4c80-b8ad-5cce2cd2e686"
Feb 18 05:51:03 crc kubenswrapper[4869]: I0218 05:51:03.891971 4869 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Feb 18 05:51:03 crc kubenswrapper[4869]: I0218 05:51:03.892060 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Feb 18 05:51:03 crc kubenswrapper[4869]: I0218 05:51:03.995608 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 18 05:51:04 crc kubenswrapper[4869]: I0218 05:51:04.166107 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 18 05:51:04 crc kubenswrapper[4869]: I0218 05:51:04.505731 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 18 05:51:04 crc kubenswrapper[4869]: I0218 05:51:04.541597 4869 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 18 05:51:04 crc kubenswrapper[4869]: I0218 05:51:04.542904 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=32.542892031 podStartE2EDuration="32.542892031s" podCreationTimestamp="2026-02-18 05:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:50:53.896366513 +0000 UTC m=+151.065454755" watchObservedRunningTime="2026-02-18 05:51:04.542892031 +0000 UTC m=+161.711980263"
Feb 18 05:51:04 crc kubenswrapper[4869]: I0218 05:51:04.545464 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-tthlh"]
Feb 18 05:51:04 crc kubenswrapper[4869]: I0218 05:51:04.545509 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 18 05:51:04 crc kubenswrapper[4869]: I0218 05:51:04.551897 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 05:51:04 crc kubenswrapper[4869]: I0218 05:51:04.565828 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=10.56581152 podStartE2EDuration="10.56581152s" podCreationTimestamp="2026-02-18 05:50:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:51:04.562284864 +0000 UTC m=+161.731373106" watchObservedRunningTime="2026-02-18 05:51:04.56581152 +0000 UTC m=+161.734899782"
Feb 18 05:51:05 crc kubenswrapper[4869]: I0218 05:51:05.180287 4869 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 18 05:51:05 crc kubenswrapper[4869]: I0218 05:51:05.180656 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://5eba024d87fdb53e76633eaafa94471662563dee017954fe1d8494df8f4d49e2" gracePeriod=5
Feb 18 05:51:05 crc kubenswrapper[4869]: I0218 05:51:05.209353 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 18 05:51:05 crc kubenswrapper[4869]: I0218 05:51:05.476291 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d72ef0c5-fb30-4d98-9237-a992acf49959" path="/var/lib/kubelet/pods/d72ef0c5-fb30-4d98-9237-a992acf49959/volumes"
Feb 18 05:51:05 crc kubenswrapper[4869]: I0218 05:51:05.487326 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 18 05:51:05 crc kubenswrapper[4869]: I0218 05:51:05.943126 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 18 05:51:06 crc kubenswrapper[4869]: I0218 05:51:06.000329 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 18 05:51:06 crc kubenswrapper[4869]: I0218 05:51:06.013840 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 18 05:51:06 crc kubenswrapper[4869]: I0218 05:51:06.032591 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 18 05:51:06 crc kubenswrapper[4869]: I0218 05:51:06.047206 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 18 05:51:06 crc kubenswrapper[4869]: I0218 05:51:06.048370 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 18 05:51:06 crc kubenswrapper[4869]: I0218 05:51:06.270684 4869 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 18 05:51:06 crc kubenswrapper[4869]: I0218 05:51:06.368919 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 18 05:51:06 crc kubenswrapper[4869]: I0218 05:51:06.375967 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 18 05:51:06 crc kubenswrapper[4869]: I0218 05:51:06.388292 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 18 05:51:06 crc kubenswrapper[4869]: I0218 05:51:06.670870 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 18 05:51:06 crc kubenswrapper[4869]: I0218 05:51:06.850274 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 18 05:51:06 crc kubenswrapper[4869]: I0218 05:51:06.879220 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 18 05:51:06 crc kubenswrapper[4869]: I0218 05:51:06.894892 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 18 05:51:06 crc kubenswrapper[4869]: I0218 05:51:06.943657 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 18 05:51:07 crc kubenswrapper[4869]: I0218 05:51:07.025398 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 18 05:51:07 crc kubenswrapper[4869]: I0218 05:51:07.146204 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 18 05:51:07 crc kubenswrapper[4869]: I0218 05:51:07.195367 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 18 05:51:07 crc kubenswrapper[4869]: I0218 05:51:07.196411 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 18 05:51:07 crc kubenswrapper[4869]: I0218 05:51:07.244442 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 18 05:51:07 crc kubenswrapper[4869]: I0218 05:51:07.325287 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 18 05:51:07 crc kubenswrapper[4869]: I0218 05:51:07.328219 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 18 05:51:07 crc kubenswrapper[4869]: I0218 05:51:07.351423 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 18 05:51:07 crc kubenswrapper[4869]: I0218 05:51:07.398535 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 18 05:51:07 crc kubenswrapper[4869]: I0218 05:51:07.465307 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 18 05:51:07 crc kubenswrapper[4869]: I0218 05:51:07.525534 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 18 05:51:07 crc kubenswrapper[4869]: I0218 05:51:07.585054 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 18 05:51:07 crc kubenswrapper[4869]: I0218 05:51:07.603283 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 18 05:51:07 crc kubenswrapper[4869]: I0218 05:51:07.611320 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 18 05:51:07 crc kubenswrapper[4869]: I0218 05:51:07.722237 4869 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 18 05:51:07 crc kubenswrapper[4869]: I0218 05:51:07.813546 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 18 05:51:07 crc kubenswrapper[4869]: I0218 05:51:07.916965 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 18 05:51:08 crc kubenswrapper[4869]: I0218 05:51:08.018410 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 18 05:51:08 crc kubenswrapper[4869]: I0218 05:51:08.062720 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 18 05:51:08 crc kubenswrapper[4869]: I0218 05:51:08.079256 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 18 05:51:08 crc kubenswrapper[4869]: I0218 05:51:08.230502 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 18 05:51:08 crc kubenswrapper[4869]: I0218 05:51:08.342811 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 18 05:51:08 crc kubenswrapper[4869]: I0218 05:51:08.450091 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 18 05:51:08 crc kubenswrapper[4869]: I0218 05:51:08.550508 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 18 05:51:08 crc kubenswrapper[4869]: I0218 05:51:08.551800 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 18 05:51:08 crc kubenswrapper[4869]: I0218 05:51:08.574699 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 18 05:51:08 crc kubenswrapper[4869]: I0218 05:51:08.602716 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 18 05:51:08 crc kubenswrapper[4869]: I0218 05:51:08.648682 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 18 05:51:08 crc kubenswrapper[4869]: I0218 05:51:08.682484 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 18 05:51:08 crc kubenswrapper[4869]: I0218 05:51:08.704635 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 18 05:51:08 crc kubenswrapper[4869]: I0218 05:51:08.791891 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Feb 18 05:51:08 crc kubenswrapper[4869]: I0218 05:51:08.800848 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 18 05:51:08 crc kubenswrapper[4869]: I0218 05:51:08.984108 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 18 05:51:09 crc kubenswrapper[4869]: I0218 05:51:09.015458 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 18 05:51:09 crc kubenswrapper[4869]: I0218 05:51:09.029948 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 18 05:51:09 crc kubenswrapper[4869]: I0218 05:51:09.123044 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 18 05:51:09 crc kubenswrapper[4869]: I0218 05:51:09.187471 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 18 05:51:09 crc kubenswrapper[4869]: I0218 05:51:09.195845 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 18 05:51:09 crc kubenswrapper[4869]: I0218 05:51:09.274424 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 18 05:51:09 crc kubenswrapper[4869]: I0218 05:51:09.360674 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 18 05:51:09 crc kubenswrapper[4869]: I0218 05:51:09.485079 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 18 05:51:09 crc kubenswrapper[4869]: I0218 05:51:09.567499 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 18 05:51:09 crc kubenswrapper[4869]: I0218 05:51:09.672078 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 18 05:51:09 crc kubenswrapper[4869]: I0218 05:51:09.761327 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 18 05:51:09 crc kubenswrapper[4869]: I0218 05:51:09.852762 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 18 05:51:09 crc kubenswrapper[4869]: I0218 05:51:09.864867 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 18 05:51:09 crc kubenswrapper[4869]: I0218 05:51:09.919482 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 18 05:51:09 crc kubenswrapper[4869]: I0218 05:51:09.954582 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 18 05:51:09 crc kubenswrapper[4869]: I0218 05:51:09.958789 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 18 05:51:09 crc kubenswrapper[4869]: I0218 05:51:09.966095 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 18 05:51:09 crc kubenswrapper[4869]: I0218 05:51:09.987510 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 18 05:51:09 crc kubenswrapper[4869]: I0218 05:51:09.992073 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.008886 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.031487 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.107117 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.125714 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.132489 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.132527 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.137824 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.140007 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.151759 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"]
Feb 18 05:51:10 crc kubenswrapper[4869]: E0218 05:51:10.151971 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.151991 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 18 05:51:10 crc kubenswrapper[4869]: E0218 05:51:10.152002 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3dab06c-6d1a-4665-a4a6-7549071c8b13" containerName="installer"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.152013 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3dab06c-6d1a-4665-a4a6-7549071c8b13" containerName="installer"
Feb 18 05:51:10 crc kubenswrapper[4869]: E0218 05:51:10.152032 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d72ef0c5-fb30-4d98-9237-a992acf49959" containerName="oauth-openshift"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.152040 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="d72ef0c5-fb30-4d98-9237-a992acf49959" containerName="oauth-openshift"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.152139 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="d72ef0c5-fb30-4d98-9237-a992acf49959" containerName="oauth-openshift"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.152151 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.152160 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3dab06c-6d1a-4665-a4a6-7549071c8b13" containerName="installer"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.152561 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.171764 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.171951 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.172087 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.172284 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.172405 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.172522 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.173105 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.173369 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.173535 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.173761 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.174200 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.174810 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.184512 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"]
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.195574 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.195988 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.222842 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.248157 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.248460 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-audit-policies\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.248482 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.248507 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-audit-dir\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.248527 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.248541 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.248557 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtsbp\" (UniqueName: \"kubernetes.io/projected/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-kube-api-access-qtsbp\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.248575 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-user-template-login\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.248676 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.248905 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-system-router-certs\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.248959 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.248982 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-system-service-ca\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.249030 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-system-session\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.249047 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-user-template-error\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.291384 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.339976 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.340245 4869 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="5eba024d87fdb53e76633eaafa94471662563dee017954fe1d8494df8f4d49e2" exitCode=137
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.345777 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.349637 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.349870 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-system-service-ca\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.349981 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-system-session\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.350078 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-user-template-error\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.350205 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.350303 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-audit-policies\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.350387 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.350483 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-audit-dir\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.350599 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.350692 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.350799 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtsbp\" (UniqueName: \"kubernetes.io/projected/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-kube-api-access-qtsbp\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.350943 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-user-template-login\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.351081 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.351235 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-system-router-certs\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.350538 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-system-service-ca\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.351100 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-audit-policies\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.350867 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-audit-dir\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.350987 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.351583 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.355507 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.355514 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-user-template-login\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.355820 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\"
(UniqueName: \"kubernetes.io/secret/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.356131 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-system-router-certs\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.356209 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.357891 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.358144 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-system-session\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.359305 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.364157 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-v4-0-config-user-template-error\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.366131 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtsbp\" (UniqueName: \"kubernetes.io/projected/2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3-kube-api-access-qtsbp\") pod \"oauth-openshift-5fff7d8cf9-qpnkn\" (UID: \"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.380782 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.415516 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.420218 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.456578 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.465234 4869 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.465938 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.474160 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.497012 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.533687 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.549326 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.603680 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.644332 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.720874 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.735093 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.735165 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.745730 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.759119 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.759209 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.759244 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.759244 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.759273 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.759344 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.759340 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.759404 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.759547 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.759573 4869 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.759593 4869 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.759605 4869 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.765139 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.796453 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.835880 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.854777 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"] Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.860807 4869 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.860834 4869 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:10 crc kubenswrapper[4869]: W0218 05:51:10.867654 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f7d9b12_c7b7_4efd_bc2e_8272331dc3e3.slice/crio-29e1bb04bc482014178e438fa3cf25784343027305557bd6c8430e9dee7a3b09 WatchSource:0}: Error finding container 29e1bb04bc482014178e438fa3cf25784343027305557bd6c8430e9dee7a3b09: Status 404 returned error can't find the container with id 29e1bb04bc482014178e438fa3cf25784343027305557bd6c8430e9dee7a3b09 Feb 18 05:51:10 crc kubenswrapper[4869]: I0218 05:51:10.998354 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.049557 4869 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.166061 4869 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.207263 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.248006 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.311268 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.346438 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn" event={"ID":"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3","Type":"ContainerStarted","Data":"a4fcb91418fd94991f7fbb87571f63ba65cfcd6e57aed0f6487f161d57176709"} Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.346478 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn" event={"ID":"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3","Type":"ContainerStarted","Data":"29e1bb04bc482014178e438fa3cf25784343027305557bd6c8430e9dee7a3b09"} Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.346907 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn" Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.349057 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.349104 4869 scope.go:117] 
"RemoveContainer" containerID="5eba024d87fdb53e76633eaafa94471662563dee017954fe1d8494df8f4d49e2" Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.349175 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.369438 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn" podStartSLOduration=60.369422748 podStartE2EDuration="1m0.369422748s" podCreationTimestamp="2026-02-18 05:50:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:51:11.368363072 +0000 UTC m=+168.537451304" watchObservedRunningTime="2026-02-18 05:51:11.369422748 +0000 UTC m=+168.538510980" Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.465903 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.478274 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.478515 4869 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.491536 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.491579 4869 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="48dc4623-4ce5-4293-bc7a-b0c51f693397" Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 
05:51:11.491599 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.491606 4869 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="48dc4623-4ce5-4293-bc7a-b0c51f693397" Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.700844 4869 patch_prober.go:28] interesting pod/oauth-openshift-5fff7d8cf9-qpnkn container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.56:6443/healthz\": read tcp 10.217.0.2:33388->10.217.0.56:6443: read: connection reset by peer" start-of-body= Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.700921 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn" podUID="2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.56:6443/healthz\": read tcp 10.217.0.2:33388->10.217.0.56:6443: read: connection reset by peer" Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.733118 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.758918 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.767386 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.820541 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.902146 4869 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.938462 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.952966 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.966921 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 18 05:51:11 crc kubenswrapper[4869]: I0218 05:51:11.987730 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 18 05:51:12 crc kubenswrapper[4869]: I0218 05:51:12.065047 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 18 05:51:12 crc kubenswrapper[4869]: I0218 05:51:12.159963 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 18 05:51:12 crc kubenswrapper[4869]: I0218 05:51:12.161482 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 18 05:51:12 crc kubenswrapper[4869]: I0218 05:51:12.364362 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-5fff7d8cf9-qpnkn_2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3/oauth-openshift/0.log" Feb 18 05:51:12 crc kubenswrapper[4869]: I0218 05:51:12.365200 4869 generic.go:334] "Generic (PLEG): container finished" podID="2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3" containerID="a4fcb91418fd94991f7fbb87571f63ba65cfcd6e57aed0f6487f161d57176709" exitCode=255 Feb 18 05:51:12 crc kubenswrapper[4869]: I0218 05:51:12.365439 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn" event={"ID":"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3","Type":"ContainerDied","Data":"a4fcb91418fd94991f7fbb87571f63ba65cfcd6e57aed0f6487f161d57176709"} Feb 18 05:51:12 crc kubenswrapper[4869]: I0218 05:51:12.366869 4869 scope.go:117] "RemoveContainer" containerID="a4fcb91418fd94991f7fbb87571f63ba65cfcd6e57aed0f6487f161d57176709" Feb 18 05:51:12 crc kubenswrapper[4869]: I0218 05:51:12.535913 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 18 05:51:12 crc kubenswrapper[4869]: I0218 05:51:12.536116 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 18 05:51:12 crc kubenswrapper[4869]: I0218 05:51:12.536242 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 18 05:51:12 crc kubenswrapper[4869]: I0218 05:51:12.536436 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 18 05:51:12 crc kubenswrapper[4869]: I0218 05:51:12.536549 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 18 05:51:12 crc kubenswrapper[4869]: I0218 05:51:12.538770 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 18 05:51:12 crc kubenswrapper[4869]: I0218 05:51:12.614107 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 18 05:51:12 crc kubenswrapper[4869]: I0218 05:51:12.641984 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 18 05:51:12 crc kubenswrapper[4869]: I0218 05:51:12.708060 4869 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 18 05:51:12 crc kubenswrapper[4869]: I0218 05:51:12.893788 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 18 05:51:12 crc kubenswrapper[4869]: I0218 05:51:12.903967 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 18 05:51:12 crc kubenswrapper[4869]: I0218 05:51:12.974965 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.105615 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.170324 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.190738 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.244219 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.373386 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-5fff7d8cf9-qpnkn_2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3/oauth-openshift/1.log" Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.374014 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-5fff7d8cf9-qpnkn_2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3/oauth-openshift/0.log" Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.374064 4869 generic.go:334] "Generic (PLEG): 
container finished" podID="2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3" containerID="a8ff39d301b678823c7113999d0da8ed7e2a9b2c9753b43277ef600ffa56ff46" exitCode=255
Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.374086 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn" event={"ID":"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3","Type":"ContainerDied","Data":"a8ff39d301b678823c7113999d0da8ed7e2a9b2c9753b43277ef600ffa56ff46"}
Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.374122 4869 scope.go:117] "RemoveContainer" containerID="a4fcb91418fd94991f7fbb87571f63ba65cfcd6e57aed0f6487f161d57176709"
Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.374609 4869 scope.go:117] "RemoveContainer" containerID="a8ff39d301b678823c7113999d0da8ed7e2a9b2c9753b43277ef600ffa56ff46"
Feb 18 05:51:13 crc kubenswrapper[4869]: E0218 05:51:13.374822 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-5fff7d8cf9-qpnkn_openshift-authentication(2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3)\"" pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn" podUID="2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3"
Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.413897 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.449062 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.468433 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.508328 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.509617 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.510539 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.582027 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.626820 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.646989 4869 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.706558 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.819500 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.861534 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.896178 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.900955 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.904913 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.941707 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.956971 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 18 05:51:13 crc kubenswrapper[4869]: I0218 05:51:13.976852 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 18 05:51:14 crc kubenswrapper[4869]: I0218 05:51:14.004990 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 18 05:51:14 crc kubenswrapper[4869]: I0218 05:51:14.069580 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 18 05:51:14 crc kubenswrapper[4869]: I0218 05:51:14.155665 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 18 05:51:14 crc kubenswrapper[4869]: I0218 05:51:14.307100 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 18 05:51:14 crc kubenswrapper[4869]: I0218 05:51:14.368845 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 18 05:51:14 crc kubenswrapper[4869]: I0218 05:51:14.378109 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 18 05:51:14 crc kubenswrapper[4869]: I0218 05:51:14.383534 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-5fff7d8cf9-qpnkn_2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3/oauth-openshift/1.log"
Feb 18 05:51:14 crc kubenswrapper[4869]: I0218 05:51:14.384703 4869 scope.go:117] "RemoveContainer" containerID="a8ff39d301b678823c7113999d0da8ed7e2a9b2c9753b43277ef600ffa56ff46"
Feb 18 05:51:14 crc kubenswrapper[4869]: E0218 05:51:14.385087 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-5fff7d8cf9-qpnkn_openshift-authentication(2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3)\"" pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn" podUID="2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3"
Feb 18 05:51:14 crc kubenswrapper[4869]: I0218 05:51:14.412327 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 18 05:51:14 crc kubenswrapper[4869]: I0218 05:51:14.480499 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 18 05:51:14 crc kubenswrapper[4869]: I0218 05:51:14.515636 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 18 05:51:14 crc kubenswrapper[4869]: I0218 05:51:14.533116 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Feb 18 05:51:14 crc kubenswrapper[4869]: I0218 05:51:14.575986 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 18 05:51:14 crc kubenswrapper[4869]: I0218 05:51:14.582110 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 18 05:51:14 crc kubenswrapper[4869]: I0218 05:51:14.620459 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 18 05:51:14 crc kubenswrapper[4869]: I0218 05:51:14.744526 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 18 05:51:14 crc kubenswrapper[4869]: I0218 05:51:14.780828 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 18 05:51:14 crc kubenswrapper[4869]: I0218 05:51:14.885907 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 18 05:51:14 crc kubenswrapper[4869]: I0218 05:51:14.963549 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 18 05:51:14 crc kubenswrapper[4869]: I0218 05:51:14.996050 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 18 05:51:15 crc kubenswrapper[4869]: I0218 05:51:15.048058 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 18 05:51:15 crc kubenswrapper[4869]: I0218 05:51:15.134836 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 18 05:51:15 crc kubenswrapper[4869]: I0218 05:51:15.155005 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 18 05:51:15 crc kubenswrapper[4869]: I0218 05:51:15.194229 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 18 05:51:15 crc kubenswrapper[4869]: I0218 05:51:15.236712 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 18 05:51:15 crc kubenswrapper[4869]: I0218 05:51:15.280159 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 18 05:51:15 crc kubenswrapper[4869]: I0218 05:51:15.296981 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 18 05:51:15 crc kubenswrapper[4869]: I0218 05:51:15.432128 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 18 05:51:15 crc kubenswrapper[4869]: I0218 05:51:15.456254 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 18 05:51:15 crc kubenswrapper[4869]: I0218 05:51:15.503533 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 18 05:51:15 crc kubenswrapper[4869]: I0218 05:51:15.505723 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 18 05:51:15 crc kubenswrapper[4869]: I0218 05:51:15.546200 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 18 05:51:15 crc kubenswrapper[4869]: I0218 05:51:15.617150 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Feb 18 05:51:15 crc kubenswrapper[4869]: I0218 05:51:15.690495 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 18 05:51:15 crc kubenswrapper[4869]: I0218 05:51:15.741031 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 18 05:51:15 crc kubenswrapper[4869]: I0218 05:51:15.765655 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 18 05:51:15 crc kubenswrapper[4869]: I0218 05:51:15.794142 4869 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 18 05:51:15 crc kubenswrapper[4869]: I0218 05:51:15.813246 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 18 05:51:15 crc kubenswrapper[4869]: I0218 05:51:15.926986 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 18 05:51:15 crc kubenswrapper[4869]: I0218 05:51:15.946622 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 18 05:51:16 crc kubenswrapper[4869]: I0218 05:51:16.017801 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 18 05:51:16 crc kubenswrapper[4869]: I0218 05:51:16.023167 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Feb 18 05:51:16 crc kubenswrapper[4869]: I0218 05:51:16.092966 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 18 05:51:16 crc kubenswrapper[4869]: I0218 05:51:16.135834 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 18 05:51:16 crc kubenswrapper[4869]: I0218 05:51:16.175292 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 18 05:51:16 crc kubenswrapper[4869]: I0218 05:51:16.239447 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Feb 18 05:51:16 crc kubenswrapper[4869]: I0218 05:51:16.275301 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 18 05:51:16 crc kubenswrapper[4869]: I0218 05:51:16.376151 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 18 05:51:16 crc kubenswrapper[4869]: I0218 05:51:16.503390 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 18 05:51:16 crc kubenswrapper[4869]: I0218 05:51:16.610234 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 18 05:51:16 crc kubenswrapper[4869]: I0218 05:51:16.649903 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 18 05:51:16 crc kubenswrapper[4869]: I0218 05:51:16.670184 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 18 05:51:16 crc kubenswrapper[4869]: I0218 05:51:16.751488 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 18 05:51:16 crc kubenswrapper[4869]: I0218 05:51:16.884387 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 18 05:51:16 crc kubenswrapper[4869]: I0218 05:51:16.936586 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 18 05:51:17 crc kubenswrapper[4869]: I0218 05:51:17.175609 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 18 05:51:17 crc kubenswrapper[4869]: I0218 05:51:17.225118 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 18 05:51:17 crc kubenswrapper[4869]: I0218 05:51:17.243450 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 18 05:51:17 crc kubenswrapper[4869]: I0218 05:51:17.245620 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 18 05:51:17 crc kubenswrapper[4869]: I0218 05:51:17.289654 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 18 05:51:17 crc kubenswrapper[4869]: I0218 05:51:17.327495 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 18 05:51:17 crc kubenswrapper[4869]: I0218 05:51:17.401659 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 18 05:51:17 crc kubenswrapper[4869]: I0218 05:51:17.432589 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 18 05:51:17 crc kubenswrapper[4869]: I0218 05:51:17.519144 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 18 05:51:17 crc kubenswrapper[4869]: I0218 05:51:17.634073 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 18 05:51:17 crc kubenswrapper[4869]: I0218 05:51:17.675731 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 18 05:51:17 crc kubenswrapper[4869]: I0218 05:51:17.791279 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 18 05:51:17 crc kubenswrapper[4869]: I0218 05:51:17.806368 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 18 05:51:17 crc kubenswrapper[4869]: I0218 05:51:17.835182 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 18 05:51:18 crc kubenswrapper[4869]: I0218 05:51:18.135531 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 18 05:51:18 crc kubenswrapper[4869]: I0218 05:51:18.165911 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 18 05:51:18 crc kubenswrapper[4869]: I0218 05:51:18.198994 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 18 05:51:18 crc kubenswrapper[4869]: I0218 05:51:18.271883 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 18 05:51:18 crc kubenswrapper[4869]: I0218 05:51:18.381914 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 18 05:51:18 crc kubenswrapper[4869]: I0218 05:51:18.384352 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 18 05:51:18 crc kubenswrapper[4869]: I0218 05:51:18.550610 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 18 05:51:18 crc kubenswrapper[4869]: I0218 05:51:18.551517 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 18 05:51:18 crc kubenswrapper[4869]: I0218 05:51:18.726384 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 18 05:51:18 crc kubenswrapper[4869]: I0218 05:51:18.757017 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 18 05:51:19 crc kubenswrapper[4869]: I0218 05:51:19.045637 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 18 05:51:19 crc kubenswrapper[4869]: I0218 05:51:19.050794 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 18 05:51:19 crc kubenswrapper[4869]: I0218 05:51:19.114479 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 18 05:51:19 crc kubenswrapper[4869]: I0218 05:51:19.142371 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 18 05:51:19 crc kubenswrapper[4869]: I0218 05:51:19.206239 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 18 05:51:19 crc kubenswrapper[4869]: I0218 05:51:19.556732 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 18 05:51:19 crc kubenswrapper[4869]: I0218 05:51:19.671901 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 18 05:51:19 crc kubenswrapper[4869]: I0218 05:51:19.706182 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 18 05:51:20 crc kubenswrapper[4869]: I0218 05:51:20.189297 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 18 05:51:20 crc kubenswrapper[4869]: I0218 05:51:20.474939 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:20 crc kubenswrapper[4869]: I0218 05:51:20.474999 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:20 crc kubenswrapper[4869]: I0218 05:51:20.475961 4869 scope.go:117] "RemoveContainer" containerID="a8ff39d301b678823c7113999d0da8ed7e2a9b2c9753b43277ef600ffa56ff46"
Feb 18 05:51:20 crc kubenswrapper[4869]: E0218 05:51:20.476368 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-5fff7d8cf9-qpnkn_openshift-authentication(2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3)\"" pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn" podUID="2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3"
Feb 18 05:51:21 crc kubenswrapper[4869]: I0218 05:51:21.155428 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 18 05:51:33 crc kubenswrapper[4869]: I0218 05:51:33.475420 4869 scope.go:117] "RemoveContainer" containerID="a8ff39d301b678823c7113999d0da8ed7e2a9b2c9753b43277ef600ffa56ff46"
Feb 18 05:51:34 crc kubenswrapper[4869]: I0218 05:51:34.494112 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-5fff7d8cf9-qpnkn_2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3/oauth-openshift/1.log"
Feb 18 05:51:34 crc kubenswrapper[4869]: I0218 05:51:34.494455 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn" event={"ID":"2f7d9b12-c7b7-4efd-bc2e-8272331dc3e3","Type":"ContainerStarted","Data":"918489b70929a86d969cec53fde3afac7ad8731022dd8f04d07c6475b8db4397"}
Feb 18 05:51:34 crc kubenswrapper[4869]: I0218 05:51:34.495119 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:34 crc kubenswrapper[4869]: I0218 05:51:34.503897 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5fff7d8cf9-qpnkn"
Feb 18 05:51:34 crc kubenswrapper[4869]: I0218 05:51:34.850227 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 18 05:51:36 crc kubenswrapper[4869]: I0218 05:51:36.023217 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 18 05:51:38 crc kubenswrapper[4869]: I0218 05:51:38.520987 4869 generic.go:334] "Generic (PLEG): container finished" podID="a5c8a86f-e3a2-4088-9839-386b9dc56d03" containerID="95da8d9bc2115bb8279152a8d9f1d2cb19e38f7fb924557c084d8af071fe7c8f" exitCode=0
Feb 18 05:51:38 crc kubenswrapper[4869]: I0218 05:51:38.521137 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zd46x" event={"ID":"a5c8a86f-e3a2-4088-9839-386b9dc56d03","Type":"ContainerDied","Data":"95da8d9bc2115bb8279152a8d9f1d2cb19e38f7fb924557c084d8af071fe7c8f"}
Feb 18 05:51:38 crc kubenswrapper[4869]: I0218 05:51:38.521941 4869 scope.go:117] "RemoveContainer" containerID="95da8d9bc2115bb8279152a8d9f1d2cb19e38f7fb924557c084d8af071fe7c8f"
Feb 18 05:51:39 crc kubenswrapper[4869]: I0218 05:51:39.529011 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zd46x" event={"ID":"a5c8a86f-e3a2-4088-9839-386b9dc56d03","Type":"ContainerStarted","Data":"c2309e339033914a8062d10b3a5234bd482515f15c25d7d37e52e6e46ddb18d0"}
Feb 18 05:51:39 crc kubenswrapper[4869]: I0218 05:51:39.529601 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zd46x"
Feb 18 05:51:39 crc kubenswrapper[4869]: I0218 05:51:39.536799 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zd46x"
Feb 18 05:51:40 crc kubenswrapper[4869]: I0218 05:51:40.132416 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 05:51:40 crc kubenswrapper[4869]: I0218 05:51:40.132789 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 05:51:51 crc kubenswrapper[4869]: I0218 05:51:51.545305 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9m6gl"]
Feb 18 05:51:51 crc kubenswrapper[4869]: I0218 05:51:51.546164 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl" podUID="5a9ba96f-26e6-4870-9e59-9735f210eef3" containerName="controller-manager" containerID="cri-o://47ab904e33db53c2142ea766fb4f508b82a40410721e2daab19618a5abc034e8" gracePeriod=30
Feb 18 05:51:51 crc kubenswrapper[4869]: I0218 05:51:51.606732 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522"]
Feb 18 05:51:51 crc kubenswrapper[4869]: I0218 05:51:51.607406 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522" podUID="c53da5b2-9e42-4160-8ed1-2600d9e76880" containerName="route-controller-manager" containerID="cri-o://79ece397e7118651c97eb5815b2ed3b6c8bf7639c5bb8b2213ddefd307c01aff" gracePeriod=30
Feb 18 05:51:51 crc kubenswrapper[4869]: I0218 05:51:51.912857 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl"
Feb 18 05:51:51 crc kubenswrapper[4869]: I0218 05:51:51.961877 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522"
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.072594 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a9ba96f-26e6-4870-9e59-9735f210eef3-serving-cert\") pod \"5a9ba96f-26e6-4870-9e59-9735f210eef3\" (UID: \"5a9ba96f-26e6-4870-9e59-9735f210eef3\") "
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.072670 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c53da5b2-9e42-4160-8ed1-2600d9e76880-config\") pod \"c53da5b2-9e42-4160-8ed1-2600d9e76880\" (UID: \"c53da5b2-9e42-4160-8ed1-2600d9e76880\") "
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.072698 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w5hr\" (UniqueName: \"kubernetes.io/projected/5a9ba96f-26e6-4870-9e59-9735f210eef3-kube-api-access-8w5hr\") pod \"5a9ba96f-26e6-4870-9e59-9735f210eef3\" (UID: \"5a9ba96f-26e6-4870-9e59-9735f210eef3\") "
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.072735 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c53da5b2-9e42-4160-8ed1-2600d9e76880-client-ca\") pod \"c53da5b2-9e42-4160-8ed1-2600d9e76880\" (UID: \"c53da5b2-9e42-4160-8ed1-2600d9e76880\") "
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.072796 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a9ba96f-26e6-4870-9e59-9735f210eef3-client-ca\") pod \"5a9ba96f-26e6-4870-9e59-9735f210eef3\" (UID: \"5a9ba96f-26e6-4870-9e59-9735f210eef3\") "
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.072825 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a9ba96f-26e6-4870-9e59-9735f210eef3-config\") pod \"5a9ba96f-26e6-4870-9e59-9735f210eef3\" (UID: \"5a9ba96f-26e6-4870-9e59-9735f210eef3\") "
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.072846 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbkch\" (UniqueName: \"kubernetes.io/projected/c53da5b2-9e42-4160-8ed1-2600d9e76880-kube-api-access-wbkch\") pod \"c53da5b2-9e42-4160-8ed1-2600d9e76880\" (UID: \"c53da5b2-9e42-4160-8ed1-2600d9e76880\") "
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.072878 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c53da5b2-9e42-4160-8ed1-2600d9e76880-serving-cert\") pod \"c53da5b2-9e42-4160-8ed1-2600d9e76880\" (UID: \"c53da5b2-9e42-4160-8ed1-2600d9e76880\") "
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.072909 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5a9ba96f-26e6-4870-9e59-9735f210eef3-proxy-ca-bundles\") pod \"5a9ba96f-26e6-4870-9e59-9735f210eef3\" (UID: \"5a9ba96f-26e6-4870-9e59-9735f210eef3\") "
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.073922 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a9ba96f-26e6-4870-9e59-9735f210eef3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5a9ba96f-26e6-4870-9e59-9735f210eef3" (UID: "5a9ba96f-26e6-4870-9e59-9735f210eef3"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.073996 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a9ba96f-26e6-4870-9e59-9735f210eef3-client-ca" (OuterVolumeSpecName: "client-ca") pod "5a9ba96f-26e6-4870-9e59-9735f210eef3" (UID: "5a9ba96f-26e6-4870-9e59-9735f210eef3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.074010 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a9ba96f-26e6-4870-9e59-9735f210eef3-config" (OuterVolumeSpecName: "config") pod "5a9ba96f-26e6-4870-9e59-9735f210eef3" (UID: "5a9ba96f-26e6-4870-9e59-9735f210eef3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.074575 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c53da5b2-9e42-4160-8ed1-2600d9e76880-client-ca" (OuterVolumeSpecName: "client-ca") pod "c53da5b2-9e42-4160-8ed1-2600d9e76880" (UID: "c53da5b2-9e42-4160-8ed1-2600d9e76880"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.075143 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c53da5b2-9e42-4160-8ed1-2600d9e76880-config" (OuterVolumeSpecName: "config") pod "c53da5b2-9e42-4160-8ed1-2600d9e76880" (UID: "c53da5b2-9e42-4160-8ed1-2600d9e76880"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.078965 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a9ba96f-26e6-4870-9e59-9735f210eef3-kube-api-access-8w5hr" (OuterVolumeSpecName: "kube-api-access-8w5hr") pod "5a9ba96f-26e6-4870-9e59-9735f210eef3" (UID: "5a9ba96f-26e6-4870-9e59-9735f210eef3"). InnerVolumeSpecName "kube-api-access-8w5hr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.079077 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c53da5b2-9e42-4160-8ed1-2600d9e76880-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c53da5b2-9e42-4160-8ed1-2600d9e76880" (UID: "c53da5b2-9e42-4160-8ed1-2600d9e76880"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.079111 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a9ba96f-26e6-4870-9e59-9735f210eef3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5a9ba96f-26e6-4870-9e59-9735f210eef3" (UID: "5a9ba96f-26e6-4870-9e59-9735f210eef3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.079539 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c53da5b2-9e42-4160-8ed1-2600d9e76880-kube-api-access-wbkch" (OuterVolumeSpecName: "kube-api-access-wbkch") pod "c53da5b2-9e42-4160-8ed1-2600d9e76880" (UID: "c53da5b2-9e42-4160-8ed1-2600d9e76880"). InnerVolumeSpecName "kube-api-access-wbkch". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.174030 4869 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a9ba96f-26e6-4870-9e59-9735f210eef3-client-ca\") on node \"crc\" DevicePath \"\""
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.174070 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbkch\" (UniqueName: \"kubernetes.io/projected/c53da5b2-9e42-4160-8ed1-2600d9e76880-kube-api-access-wbkch\") on node \"crc\" DevicePath \"\""
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.174086 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a9ba96f-26e6-4870-9e59-9735f210eef3-config\") on node \"crc\" DevicePath \"\""
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.174099 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c53da5b2-9e42-4160-8ed1-2600d9e76880-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.174110 4869 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5a9ba96f-26e6-4870-9e59-9735f210eef3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.174120 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a9ba96f-26e6-4870-9e59-9735f210eef3-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.174129 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c53da5b2-9e42-4160-8ed1-2600d9e76880-config\") on node \"crc\" DevicePath \"\""
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.174139 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w5hr\" (UniqueName: \"kubernetes.io/projected/5a9ba96f-26e6-4870-9e59-9735f210eef3-kube-api-access-8w5hr\") on node \"crc\" DevicePath \"\""
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.174154 4869 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c53da5b2-9e42-4160-8ed1-2600d9e76880-client-ca\") on node \"crc\" DevicePath \"\""
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.602891 4869 generic.go:334] "Generic (PLEG): container finished" podID="c53da5b2-9e42-4160-8ed1-2600d9e76880" containerID="79ece397e7118651c97eb5815b2ed3b6c8bf7639c5bb8b2213ddefd307c01aff" exitCode=0
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.602991 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522"
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.602966 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522" event={"ID":"c53da5b2-9e42-4160-8ed1-2600d9e76880","Type":"ContainerDied","Data":"79ece397e7118651c97eb5815b2ed3b6c8bf7639c5bb8b2213ddefd307c01aff"}
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.605295 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522" event={"ID":"c53da5b2-9e42-4160-8ed1-2600d9e76880","Type":"ContainerDied","Data":"ea9d1afc23b7243ed492a555e2c05fd770560b93a2a719d5c9c8693f8996d1a8"}
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.605335 4869 scope.go:117] "RemoveContainer" containerID="79ece397e7118651c97eb5815b2ed3b6c8bf7639c5bb8b2213ddefd307c01aff"
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.607516 4869 generic.go:334] "Generic (PLEG): container finished" podID="5a9ba96f-26e6-4870-9e59-9735f210eef3" containerID="47ab904e33db53c2142ea766fb4f508b82a40410721e2daab19618a5abc034e8" exitCode=0
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.607565 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl" event={"ID":"5a9ba96f-26e6-4870-9e59-9735f210eef3","Type":"ContainerDied","Data":"47ab904e33db53c2142ea766fb4f508b82a40410721e2daab19618a5abc034e8"}
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.607597 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl" event={"ID":"5a9ba96f-26e6-4870-9e59-9735f210eef3","Type":"ContainerDied","Data":"f38b5769e49725124cebffcacd874108dc7086df671e943c95d4d09f43a2df8d"}
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.607868 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9m6gl"
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.626341 4869 scope.go:117] "RemoveContainer" containerID="79ece397e7118651c97eb5815b2ed3b6c8bf7639c5bb8b2213ddefd307c01aff"
Feb 18 05:51:52 crc kubenswrapper[4869]: E0218 05:51:52.626732 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79ece397e7118651c97eb5815b2ed3b6c8bf7639c5bb8b2213ddefd307c01aff\": container with ID starting with 79ece397e7118651c97eb5815b2ed3b6c8bf7639c5bb8b2213ddefd307c01aff not found: ID does not exist" containerID="79ece397e7118651c97eb5815b2ed3b6c8bf7639c5bb8b2213ddefd307c01aff"
Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.626809 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79ece397e7118651c97eb5815b2ed3b6c8bf7639c5bb8b2213ddefd307c01aff"} err="failed to get container status \"79ece397e7118651c97eb5815b2ed3b6c8bf7639c5bb8b2213ddefd307c01aff\": rpc error: code = NotFound desc = could
not find container \"79ece397e7118651c97eb5815b2ed3b6c8bf7639c5bb8b2213ddefd307c01aff\": container with ID starting with 79ece397e7118651c97eb5815b2ed3b6c8bf7639c5bb8b2213ddefd307c01aff not found: ID does not exist" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.626844 4869 scope.go:117] "RemoveContainer" containerID="47ab904e33db53c2142ea766fb4f508b82a40410721e2daab19618a5abc034e8" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.646503 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9m6gl"] Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.650255 4869 scope.go:117] "RemoveContainer" containerID="47ab904e33db53c2142ea766fb4f508b82a40410721e2daab19618a5abc034e8" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.650471 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9m6gl"] Feb 18 05:51:52 crc kubenswrapper[4869]: E0218 05:51:52.651225 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47ab904e33db53c2142ea766fb4f508b82a40410721e2daab19618a5abc034e8\": container with ID starting with 47ab904e33db53c2142ea766fb4f508b82a40410721e2daab19618a5abc034e8 not found: ID does not exist" containerID="47ab904e33db53c2142ea766fb4f508b82a40410721e2daab19618a5abc034e8" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.651384 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47ab904e33db53c2142ea766fb4f508b82a40410721e2daab19618a5abc034e8"} err="failed to get container status \"47ab904e33db53c2142ea766fb4f508b82a40410721e2daab19618a5abc034e8\": rpc error: code = NotFound desc = could not find container \"47ab904e33db53c2142ea766fb4f508b82a40410721e2daab19618a5abc034e8\": container with ID starting with 47ab904e33db53c2142ea766fb4f508b82a40410721e2daab19618a5abc034e8 not found: ID does 
not exist" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.693277 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522"] Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.697322 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7522"] Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.877386 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5898c9b65b-msbl5"] Feb 18 05:51:52 crc kubenswrapper[4869]: E0218 05:51:52.877690 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c53da5b2-9e42-4160-8ed1-2600d9e76880" containerName="route-controller-manager" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.877707 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="c53da5b2-9e42-4160-8ed1-2600d9e76880" containerName="route-controller-manager" Feb 18 05:51:52 crc kubenswrapper[4869]: E0218 05:51:52.877728 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a9ba96f-26e6-4870-9e59-9735f210eef3" containerName="controller-manager" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.877736 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a9ba96f-26e6-4870-9e59-9735f210eef3" containerName="controller-manager" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.877864 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="c53da5b2-9e42-4160-8ed1-2600d9e76880" containerName="route-controller-manager" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.877887 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a9ba96f-26e6-4870-9e59-9735f210eef3" containerName="controller-manager" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.878364 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5898c9b65b-msbl5" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.882242 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.882532 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2"] Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.883016 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.883192 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.883224 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.884986 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.885153 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.888386 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.888522 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.888665 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 18 
05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.890280 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.891526 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.891728 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.895974 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.896285 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.902783 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2"] Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.907605 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5898c9b65b-msbl5"] Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.983295 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6aa11b4-1065-474b-8025-b96742923df7-config\") pod \"controller-manager-5898c9b65b-msbl5\" (UID: \"b6aa11b4-1065-474b-8025-b96742923df7\") " pod="openshift-controller-manager/controller-manager-5898c9b65b-msbl5" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.983340 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/17c071cd-d304-4b43-8ff8-5d172c43b79e-serving-cert\") pod \"route-controller-manager-68dbdd84c8-h5jv2\" (UID: \"17c071cd-d304-4b43-8ff8-5d172c43b79e\") " pod="openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.983364 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cxgt\" (UniqueName: \"kubernetes.io/projected/17c071cd-d304-4b43-8ff8-5d172c43b79e-kube-api-access-5cxgt\") pod \"route-controller-manager-68dbdd84c8-h5jv2\" (UID: \"17c071cd-d304-4b43-8ff8-5d172c43b79e\") " pod="openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.983384 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17c071cd-d304-4b43-8ff8-5d172c43b79e-client-ca\") pod \"route-controller-manager-68dbdd84c8-h5jv2\" (UID: \"17c071cd-d304-4b43-8ff8-5d172c43b79e\") " pod="openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.983424 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6aa11b4-1065-474b-8025-b96742923df7-client-ca\") pod \"controller-manager-5898c9b65b-msbl5\" (UID: \"b6aa11b4-1065-474b-8025-b96742923df7\") " pod="openshift-controller-manager/controller-manager-5898c9b65b-msbl5" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.983534 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6aa11b4-1065-474b-8025-b96742923df7-serving-cert\") pod \"controller-manager-5898c9b65b-msbl5\" (UID: \"b6aa11b4-1065-474b-8025-b96742923df7\") " 
pod="openshift-controller-manager/controller-manager-5898c9b65b-msbl5" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.983591 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c071cd-d304-4b43-8ff8-5d172c43b79e-config\") pod \"route-controller-manager-68dbdd84c8-h5jv2\" (UID: \"17c071cd-d304-4b43-8ff8-5d172c43b79e\") " pod="openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.983613 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xctgf\" (UniqueName: \"kubernetes.io/projected/b6aa11b4-1065-474b-8025-b96742923df7-kube-api-access-xctgf\") pod \"controller-manager-5898c9b65b-msbl5\" (UID: \"b6aa11b4-1065-474b-8025-b96742923df7\") " pod="openshift-controller-manager/controller-manager-5898c9b65b-msbl5" Feb 18 05:51:52 crc kubenswrapper[4869]: I0218 05:51:52.983730 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6aa11b4-1065-474b-8025-b96742923df7-proxy-ca-bundles\") pod \"controller-manager-5898c9b65b-msbl5\" (UID: \"b6aa11b4-1065-474b-8025-b96742923df7\") " pod="openshift-controller-manager/controller-manager-5898c9b65b-msbl5" Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.085445 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c071cd-d304-4b43-8ff8-5d172c43b79e-config\") pod \"route-controller-manager-68dbdd84c8-h5jv2\" (UID: \"17c071cd-d304-4b43-8ff8-5d172c43b79e\") " pod="openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2" Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.085521 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xctgf\" (UniqueName: \"kubernetes.io/projected/b6aa11b4-1065-474b-8025-b96742923df7-kube-api-access-xctgf\") pod \"controller-manager-5898c9b65b-msbl5\" (UID: \"b6aa11b4-1065-474b-8025-b96742923df7\") " pod="openshift-controller-manager/controller-manager-5898c9b65b-msbl5" Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.085545 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6aa11b4-1065-474b-8025-b96742923df7-proxy-ca-bundles\") pod \"controller-manager-5898c9b65b-msbl5\" (UID: \"b6aa11b4-1065-474b-8025-b96742923df7\") " pod="openshift-controller-manager/controller-manager-5898c9b65b-msbl5" Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.085579 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6aa11b4-1065-474b-8025-b96742923df7-config\") pod \"controller-manager-5898c9b65b-msbl5\" (UID: \"b6aa11b4-1065-474b-8025-b96742923df7\") " pod="openshift-controller-manager/controller-manager-5898c9b65b-msbl5" Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.085604 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17c071cd-d304-4b43-8ff8-5d172c43b79e-serving-cert\") pod \"route-controller-manager-68dbdd84c8-h5jv2\" (UID: \"17c071cd-d304-4b43-8ff8-5d172c43b79e\") " pod="openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2" Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.085627 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cxgt\" (UniqueName: \"kubernetes.io/projected/17c071cd-d304-4b43-8ff8-5d172c43b79e-kube-api-access-5cxgt\") pod \"route-controller-manager-68dbdd84c8-h5jv2\" (UID: \"17c071cd-d304-4b43-8ff8-5d172c43b79e\") " 
pod="openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2" Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.085648 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17c071cd-d304-4b43-8ff8-5d172c43b79e-client-ca\") pod \"route-controller-manager-68dbdd84c8-h5jv2\" (UID: \"17c071cd-d304-4b43-8ff8-5d172c43b79e\") " pod="openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2" Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.085674 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6aa11b4-1065-474b-8025-b96742923df7-client-ca\") pod \"controller-manager-5898c9b65b-msbl5\" (UID: \"b6aa11b4-1065-474b-8025-b96742923df7\") " pod="openshift-controller-manager/controller-manager-5898c9b65b-msbl5" Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.085705 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6aa11b4-1065-474b-8025-b96742923df7-serving-cert\") pod \"controller-manager-5898c9b65b-msbl5\" (UID: \"b6aa11b4-1065-474b-8025-b96742923df7\") " pod="openshift-controller-manager/controller-manager-5898c9b65b-msbl5" Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.086572 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17c071cd-d304-4b43-8ff8-5d172c43b79e-client-ca\") pod \"route-controller-manager-68dbdd84c8-h5jv2\" (UID: \"17c071cd-d304-4b43-8ff8-5d172c43b79e\") " pod="openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2" Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.086901 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/b6aa11b4-1065-474b-8025-b96742923df7-proxy-ca-bundles\") pod \"controller-manager-5898c9b65b-msbl5\" (UID: \"b6aa11b4-1065-474b-8025-b96742923df7\") " pod="openshift-controller-manager/controller-manager-5898c9b65b-msbl5" Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.086925 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c071cd-d304-4b43-8ff8-5d172c43b79e-config\") pod \"route-controller-manager-68dbdd84c8-h5jv2\" (UID: \"17c071cd-d304-4b43-8ff8-5d172c43b79e\") " pod="openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2" Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.087135 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6aa11b4-1065-474b-8025-b96742923df7-client-ca\") pod \"controller-manager-5898c9b65b-msbl5\" (UID: \"b6aa11b4-1065-474b-8025-b96742923df7\") " pod="openshift-controller-manager/controller-manager-5898c9b65b-msbl5" Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.087173 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6aa11b4-1065-474b-8025-b96742923df7-config\") pod \"controller-manager-5898c9b65b-msbl5\" (UID: \"b6aa11b4-1065-474b-8025-b96742923df7\") " pod="openshift-controller-manager/controller-manager-5898c9b65b-msbl5" Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.089443 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17c071cd-d304-4b43-8ff8-5d172c43b79e-serving-cert\") pod \"route-controller-manager-68dbdd84c8-h5jv2\" (UID: \"17c071cd-d304-4b43-8ff8-5d172c43b79e\") " pod="openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2" Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.092612 4869 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6aa11b4-1065-474b-8025-b96742923df7-serving-cert\") pod \"controller-manager-5898c9b65b-msbl5\" (UID: \"b6aa11b4-1065-474b-8025-b96742923df7\") " pod="openshift-controller-manager/controller-manager-5898c9b65b-msbl5" Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.103607 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cxgt\" (UniqueName: \"kubernetes.io/projected/17c071cd-d304-4b43-8ff8-5d172c43b79e-kube-api-access-5cxgt\") pod \"route-controller-manager-68dbdd84c8-h5jv2\" (UID: \"17c071cd-d304-4b43-8ff8-5d172c43b79e\") " pod="openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2" Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.105433 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xctgf\" (UniqueName: \"kubernetes.io/projected/b6aa11b4-1065-474b-8025-b96742923df7-kube-api-access-xctgf\") pod \"controller-manager-5898c9b65b-msbl5\" (UID: \"b6aa11b4-1065-474b-8025-b96742923df7\") " pod="openshift-controller-manager/controller-manager-5898c9b65b-msbl5" Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.198911 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5898c9b65b-msbl5" Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.209089 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2" Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.441366 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2"] Feb 18 05:51:53 crc kubenswrapper[4869]: W0218 05:51:53.444904 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17c071cd_d304_4b43_8ff8_5d172c43b79e.slice/crio-85e064a582c9bde09ed8801349adc3edf8a388f74d50ae9de6307a8e94a04018 WatchSource:0}: Error finding container 85e064a582c9bde09ed8801349adc3edf8a388f74d50ae9de6307a8e94a04018: Status 404 returned error can't find the container with id 85e064a582c9bde09ed8801349adc3edf8a388f74d50ae9de6307a8e94a04018 Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.477345 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a9ba96f-26e6-4870-9e59-9735f210eef3" path="/var/lib/kubelet/pods/5a9ba96f-26e6-4870-9e59-9735f210eef3/volumes" Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.478185 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c53da5b2-9e42-4160-8ed1-2600d9e76880" path="/var/lib/kubelet/pods/c53da5b2-9e42-4160-8ed1-2600d9e76880/volumes" Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.598615 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5898c9b65b-msbl5"] Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.614882 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2" event={"ID":"17c071cd-d304-4b43-8ff8-5d172c43b79e","Type":"ContainerStarted","Data":"735d4646e430112721b93939c4c01106bb9a6b4136d43fba4438337551a1ff0b"} Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.614966 4869 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2" event={"ID":"17c071cd-d304-4b43-8ff8-5d172c43b79e","Type":"ContainerStarted","Data":"85e064a582c9bde09ed8801349adc3edf8a388f74d50ae9de6307a8e94a04018"} Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.615014 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2" Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.616532 4869 patch_prober.go:28] interesting pod/route-controller-manager-68dbdd84c8-h5jv2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body= Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.616612 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2" podUID="17c071cd-d304-4b43-8ff8-5d172c43b79e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.616702 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5898c9b65b-msbl5" event={"ID":"b6aa11b4-1065-474b-8025-b96742923df7","Type":"ContainerStarted","Data":"6980d875855d250ccc089eda87d29ed74ff23ca9ad32a15f482d0240165d0833"} Feb 18 05:51:53 crc kubenswrapper[4869]: I0218 05:51:53.630674 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2" podStartSLOduration=2.630650524 podStartE2EDuration="2.630650524s" podCreationTimestamp="2026-02-18 05:51:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:51:53.629109286 +0000 UTC m=+210.798197518" watchObservedRunningTime="2026-02-18 05:51:53.630650524 +0000 UTC m=+210.799738756" Feb 18 05:51:54 crc kubenswrapper[4869]: I0218 05:51:54.623589 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5898c9b65b-msbl5" event={"ID":"b6aa11b4-1065-474b-8025-b96742923df7","Type":"ContainerStarted","Data":"52a29e730c1447c6c267821a4cbf86393732d53b6c6fc1eaa82e93336a868308"} Feb 18 05:51:54 crc kubenswrapper[4869]: I0218 05:51:54.624226 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5898c9b65b-msbl5" Feb 18 05:51:54 crc kubenswrapper[4869]: I0218 05:51:54.627672 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5898c9b65b-msbl5" Feb 18 05:51:54 crc kubenswrapper[4869]: I0218 05:51:54.628323 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2" Feb 18 05:51:54 crc kubenswrapper[4869]: I0218 05:51:54.642883 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5898c9b65b-msbl5" podStartSLOduration=3.642867949 podStartE2EDuration="3.642867949s" podCreationTimestamp="2026-02-18 05:51:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:51:54.639560607 +0000 UTC m=+211.808648839" watchObservedRunningTime="2026-02-18 05:51:54.642867949 +0000 UTC m=+211.811956181" Feb 18 05:52:10 crc kubenswrapper[4869]: I0218 05:52:10.133406 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 05:52:10 crc kubenswrapper[4869]: I0218 05:52:10.134016 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 05:52:10 crc kubenswrapper[4869]: I0218 05:52:10.134057 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh"
Feb 18 05:52:10 crc kubenswrapper[4869]: I0218 05:52:10.134591 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"97067aaa66b615246c12637e475a0c048474fb0516f40d0cbe72ff5c54a9bc80"} pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 18 05:52:10 crc kubenswrapper[4869]: I0218 05:52:10.134642 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" containerID="cri-o://97067aaa66b615246c12637e475a0c048474fb0516f40d0cbe72ff5c54a9bc80" gracePeriod=600
Feb 18 05:52:10 crc kubenswrapper[4869]: I0218 05:52:10.702124 4869 generic.go:334] "Generic (PLEG): container finished" podID="781aec66-5fc7-4161-a704-cc78830d525d" containerID="97067aaa66b615246c12637e475a0c048474fb0516f40d0cbe72ff5c54a9bc80" exitCode=0
Feb 18 05:52:10 crc kubenswrapper[4869]: I0218 05:52:10.702239 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerDied","Data":"97067aaa66b615246c12637e475a0c048474fb0516f40d0cbe72ff5c54a9bc80"}
Feb 18 05:52:10 crc kubenswrapper[4869]: I0218 05:52:10.702479 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerStarted","Data":"e6cda088debb8f246e50d84872e8120984c327e81d95605578875258e09eeddf"}
Feb 18 05:52:31 crc kubenswrapper[4869]: I0218 05:52:31.572764 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2"]
Feb 18 05:52:31 crc kubenswrapper[4869]: I0218 05:52:31.573582 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2" podUID="17c071cd-d304-4b43-8ff8-5d172c43b79e" containerName="route-controller-manager" containerID="cri-o://735d4646e430112721b93939c4c01106bb9a6b4136d43fba4438337551a1ff0b" gracePeriod=30
Feb 18 05:52:31 crc kubenswrapper[4869]: I0218 05:52:31.802544 4869 generic.go:334] "Generic (PLEG): container finished" podID="17c071cd-d304-4b43-8ff8-5d172c43b79e" containerID="735d4646e430112721b93939c4c01106bb9a6b4136d43fba4438337551a1ff0b" exitCode=0
Feb 18 05:52:31 crc kubenswrapper[4869]: I0218 05:52:31.802617 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2" event={"ID":"17c071cd-d304-4b43-8ff8-5d172c43b79e","Type":"ContainerDied","Data":"735d4646e430112721b93939c4c01106bb9a6b4136d43fba4438337551a1ff0b"}
Feb 18 05:52:31 crc kubenswrapper[4869]: I0218 05:52:31.909897 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2"
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.060529 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17c071cd-d304-4b43-8ff8-5d172c43b79e-serving-cert\") pod \"17c071cd-d304-4b43-8ff8-5d172c43b79e\" (UID: \"17c071cd-d304-4b43-8ff8-5d172c43b79e\") "
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.060589 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17c071cd-d304-4b43-8ff8-5d172c43b79e-client-ca\") pod \"17c071cd-d304-4b43-8ff8-5d172c43b79e\" (UID: \"17c071cd-d304-4b43-8ff8-5d172c43b79e\") "
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.060653 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cxgt\" (UniqueName: \"kubernetes.io/projected/17c071cd-d304-4b43-8ff8-5d172c43b79e-kube-api-access-5cxgt\") pod \"17c071cd-d304-4b43-8ff8-5d172c43b79e\" (UID: \"17c071cd-d304-4b43-8ff8-5d172c43b79e\") "
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.061415 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17c071cd-d304-4b43-8ff8-5d172c43b79e-client-ca" (OuterVolumeSpecName: "client-ca") pod "17c071cd-d304-4b43-8ff8-5d172c43b79e" (UID: "17c071cd-d304-4b43-8ff8-5d172c43b79e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.061640 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c071cd-d304-4b43-8ff8-5d172c43b79e-config\") pod \"17c071cd-d304-4b43-8ff8-5d172c43b79e\" (UID: \"17c071cd-d304-4b43-8ff8-5d172c43b79e\") "
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.061836 4869 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17c071cd-d304-4b43-8ff8-5d172c43b79e-client-ca\") on node \"crc\" DevicePath \"\""
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.062291 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17c071cd-d304-4b43-8ff8-5d172c43b79e-config" (OuterVolumeSpecName: "config") pod "17c071cd-d304-4b43-8ff8-5d172c43b79e" (UID: "17c071cd-d304-4b43-8ff8-5d172c43b79e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.066849 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17c071cd-d304-4b43-8ff8-5d172c43b79e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "17c071cd-d304-4b43-8ff8-5d172c43b79e" (UID: "17c071cd-d304-4b43-8ff8-5d172c43b79e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.072927 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17c071cd-d304-4b43-8ff8-5d172c43b79e-kube-api-access-5cxgt" (OuterVolumeSpecName: "kube-api-access-5cxgt") pod "17c071cd-d304-4b43-8ff8-5d172c43b79e" (UID: "17c071cd-d304-4b43-8ff8-5d172c43b79e"). InnerVolumeSpecName "kube-api-access-5cxgt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.162123 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cxgt\" (UniqueName: \"kubernetes.io/projected/17c071cd-d304-4b43-8ff8-5d172c43b79e-kube-api-access-5cxgt\") on node \"crc\" DevicePath \"\""
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.162425 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c071cd-d304-4b43-8ff8-5d172c43b79e-config\") on node \"crc\" DevicePath \"\""
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.162436 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17c071cd-d304-4b43-8ff8-5d172c43b79e-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.809403 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2" event={"ID":"17c071cd-d304-4b43-8ff8-5d172c43b79e","Type":"ContainerDied","Data":"85e064a582c9bde09ed8801349adc3edf8a388f74d50ae9de6307a8e94a04018"}
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.809469 4869 scope.go:117] "RemoveContainer" containerID="735d4646e430112721b93939c4c01106bb9a6b4136d43fba4438337551a1ff0b"
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.809482 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2"
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.847854 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2"]
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.854483 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68dbdd84c8-h5jv2"]
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.912625 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7797d56bdb-j7ppz"]
Feb 18 05:52:32 crc kubenswrapper[4869]: E0218 05:52:32.913109 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17c071cd-d304-4b43-8ff8-5d172c43b79e" containerName="route-controller-manager"
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.913143 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="17c071cd-d304-4b43-8ff8-5d172c43b79e" containerName="route-controller-manager"
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.913276 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="17c071cd-d304-4b43-8ff8-5d172c43b79e" containerName="route-controller-manager"
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.913919 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7797d56bdb-j7ppz"
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.917338 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.917651 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.917716 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.917913 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.918136 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.918520 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 18 05:52:32 crc kubenswrapper[4869]: I0218 05:52:32.924639 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7797d56bdb-j7ppz"]
Feb 18 05:52:33 crc kubenswrapper[4869]: I0218 05:52:33.072006 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26d8995f-dc71-4e81-aacd-76fce1bc831b-config\") pod \"route-controller-manager-7797d56bdb-j7ppz\" (UID: \"26d8995f-dc71-4e81-aacd-76fce1bc831b\") " pod="openshift-route-controller-manager/route-controller-manager-7797d56bdb-j7ppz"
Feb 18 05:52:33 crc kubenswrapper[4869]: I0218 05:52:33.072103 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26d8995f-dc71-4e81-aacd-76fce1bc831b-client-ca\") pod \"route-controller-manager-7797d56bdb-j7ppz\" (UID: \"26d8995f-dc71-4e81-aacd-76fce1bc831b\") " pod="openshift-route-controller-manager/route-controller-manager-7797d56bdb-j7ppz"
Feb 18 05:52:33 crc kubenswrapper[4869]: I0218 05:52:33.072189 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w45tc\" (UniqueName: \"kubernetes.io/projected/26d8995f-dc71-4e81-aacd-76fce1bc831b-kube-api-access-w45tc\") pod \"route-controller-manager-7797d56bdb-j7ppz\" (UID: \"26d8995f-dc71-4e81-aacd-76fce1bc831b\") " pod="openshift-route-controller-manager/route-controller-manager-7797d56bdb-j7ppz"
Feb 18 05:52:33 crc kubenswrapper[4869]: I0218 05:52:33.072340 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26d8995f-dc71-4e81-aacd-76fce1bc831b-serving-cert\") pod \"route-controller-manager-7797d56bdb-j7ppz\" (UID: \"26d8995f-dc71-4e81-aacd-76fce1bc831b\") " pod="openshift-route-controller-manager/route-controller-manager-7797d56bdb-j7ppz"
Feb 18 05:52:33 crc kubenswrapper[4869]: I0218 05:52:33.173241 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w45tc\" (UniqueName: \"kubernetes.io/projected/26d8995f-dc71-4e81-aacd-76fce1bc831b-kube-api-access-w45tc\") pod \"route-controller-manager-7797d56bdb-j7ppz\" (UID: \"26d8995f-dc71-4e81-aacd-76fce1bc831b\") " pod="openshift-route-controller-manager/route-controller-manager-7797d56bdb-j7ppz"
Feb 18 05:52:33 crc kubenswrapper[4869]: I0218 05:52:33.173298 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26d8995f-dc71-4e81-aacd-76fce1bc831b-serving-cert\") pod \"route-controller-manager-7797d56bdb-j7ppz\" (UID: \"26d8995f-dc71-4e81-aacd-76fce1bc831b\") " pod="openshift-route-controller-manager/route-controller-manager-7797d56bdb-j7ppz"
Feb 18 05:52:33 crc kubenswrapper[4869]: I0218 05:52:33.173355 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26d8995f-dc71-4e81-aacd-76fce1bc831b-config\") pod \"route-controller-manager-7797d56bdb-j7ppz\" (UID: \"26d8995f-dc71-4e81-aacd-76fce1bc831b\") " pod="openshift-route-controller-manager/route-controller-manager-7797d56bdb-j7ppz"
Feb 18 05:52:33 crc kubenswrapper[4869]: I0218 05:52:33.173380 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26d8995f-dc71-4e81-aacd-76fce1bc831b-client-ca\") pod \"route-controller-manager-7797d56bdb-j7ppz\" (UID: \"26d8995f-dc71-4e81-aacd-76fce1bc831b\") " pod="openshift-route-controller-manager/route-controller-manager-7797d56bdb-j7ppz"
Feb 18 05:52:33 crc kubenswrapper[4869]: I0218 05:52:33.174326 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26d8995f-dc71-4e81-aacd-76fce1bc831b-client-ca\") pod \"route-controller-manager-7797d56bdb-j7ppz\" (UID: \"26d8995f-dc71-4e81-aacd-76fce1bc831b\") " pod="openshift-route-controller-manager/route-controller-manager-7797d56bdb-j7ppz"
Feb 18 05:52:33 crc kubenswrapper[4869]: I0218 05:52:33.175920 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26d8995f-dc71-4e81-aacd-76fce1bc831b-config\") pod \"route-controller-manager-7797d56bdb-j7ppz\" (UID: \"26d8995f-dc71-4e81-aacd-76fce1bc831b\") " pod="openshift-route-controller-manager/route-controller-manager-7797d56bdb-j7ppz"
Feb 18 05:52:33 crc kubenswrapper[4869]: I0218 05:52:33.178227 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26d8995f-dc71-4e81-aacd-76fce1bc831b-serving-cert\") pod \"route-controller-manager-7797d56bdb-j7ppz\" (UID: \"26d8995f-dc71-4e81-aacd-76fce1bc831b\") " pod="openshift-route-controller-manager/route-controller-manager-7797d56bdb-j7ppz"
Feb 18 05:52:33 crc kubenswrapper[4869]: I0218 05:52:33.188190 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w45tc\" (UniqueName: \"kubernetes.io/projected/26d8995f-dc71-4e81-aacd-76fce1bc831b-kube-api-access-w45tc\") pod \"route-controller-manager-7797d56bdb-j7ppz\" (UID: \"26d8995f-dc71-4e81-aacd-76fce1bc831b\") " pod="openshift-route-controller-manager/route-controller-manager-7797d56bdb-j7ppz"
Feb 18 05:52:33 crc kubenswrapper[4869]: I0218 05:52:33.232594 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7797d56bdb-j7ppz"
Feb 18 05:52:33 crc kubenswrapper[4869]: I0218 05:52:33.476133 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17c071cd-d304-4b43-8ff8-5d172c43b79e" path="/var/lib/kubelet/pods/17c071cd-d304-4b43-8ff8-5d172c43b79e/volumes"
Feb 18 05:52:33 crc kubenswrapper[4869]: I0218 05:52:33.606234 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7797d56bdb-j7ppz"]
Feb 18 05:52:33 crc kubenswrapper[4869]: I0218 05:52:33.814668 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7797d56bdb-j7ppz" event={"ID":"26d8995f-dc71-4e81-aacd-76fce1bc831b","Type":"ContainerStarted","Data":"bf04441650545b1dfa9b2a955a3203b4b6022acc8555eb4afb38bebb02483443"}
Feb 18 05:52:33 crc kubenswrapper[4869]: I0218 05:52:33.814776 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7797d56bdb-j7ppz"
Feb 18 05:52:33 crc kubenswrapper[4869]: I0218 05:52:33.814791 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7797d56bdb-j7ppz" event={"ID":"26d8995f-dc71-4e81-aacd-76fce1bc831b","Type":"ContainerStarted","Data":"4fd0b96808de8d766a18a193be1f1662cfbc3f9d166cb50c6bbf637ff1a185fe"}
Feb 18 05:52:33 crc kubenswrapper[4869]: I0218 05:52:33.830969 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7797d56bdb-j7ppz" podStartSLOduration=2.830947883 podStartE2EDuration="2.830947883s" podCreationTimestamp="2026-02-18 05:52:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:52:33.826704648 +0000 UTC m=+250.995792890" watchObservedRunningTime="2026-02-18 05:52:33.830947883 +0000 UTC m=+251.000036115"
Feb 18 05:52:34 crc kubenswrapper[4869]: I0218 05:52:34.138089 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7797d56bdb-j7ppz"
Feb 18 05:52:38 crc kubenswrapper[4869]: I0218 05:52:38.032005 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jz4sv"]
Feb 18 05:52:38 crc kubenswrapper[4869]: I0218 05:52:38.033636 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv"
Feb 18 05:52:38 crc kubenswrapper[4869]: I0218 05:52:38.044454 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jz4sv"]
Feb 18 05:52:38 crc kubenswrapper[4869]: I0218 05:52:38.227879 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f9cf561c-2b43-4b52-af29-188d48b6b441-registry-certificates\") pod \"image-registry-66df7c8f76-jz4sv\" (UID: \"f9cf561c-2b43-4b52-af29-188d48b6b441\") " pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv"
Feb 18 05:52:38 crc kubenswrapper[4869]: I0218 05:52:38.227924 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f9cf561c-2b43-4b52-af29-188d48b6b441-bound-sa-token\") pod \"image-registry-66df7c8f76-jz4sv\" (UID: \"f9cf561c-2b43-4b52-af29-188d48b6b441\") " pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv"
Feb 18 05:52:38 crc kubenswrapper[4869]: I0218 05:52:38.227962 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f9cf561c-2b43-4b52-af29-188d48b6b441-trusted-ca\") pod \"image-registry-66df7c8f76-jz4sv\" (UID: \"f9cf561c-2b43-4b52-af29-188d48b6b441\") " pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv"
Feb 18 05:52:38 crc kubenswrapper[4869]: I0218 05:52:38.227995 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f9cf561c-2b43-4b52-af29-188d48b6b441-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jz4sv\" (UID: \"f9cf561c-2b43-4b52-af29-188d48b6b441\") " pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv"
Feb 18 05:52:38 crc kubenswrapper[4869]: I0218 05:52:38.228054 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f9cf561c-2b43-4b52-af29-188d48b6b441-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jz4sv\" (UID: \"f9cf561c-2b43-4b52-af29-188d48b6b441\") " pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv"
Feb 18 05:52:38 crc kubenswrapper[4869]: I0218 05:52:38.228090 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-jz4sv\" (UID: \"f9cf561c-2b43-4b52-af29-188d48b6b441\") " pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv"
Feb 18 05:52:38 crc kubenswrapper[4869]: I0218 05:52:38.228114 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f9cf561c-2b43-4b52-af29-188d48b6b441-registry-tls\") pod \"image-registry-66df7c8f76-jz4sv\" (UID: \"f9cf561c-2b43-4b52-af29-188d48b6b441\") " pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv"
Feb 18 05:52:38 crc kubenswrapper[4869]: I0218 05:52:38.228146 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sscws\" (UniqueName: \"kubernetes.io/projected/f9cf561c-2b43-4b52-af29-188d48b6b441-kube-api-access-sscws\") pod \"image-registry-66df7c8f76-jz4sv\" (UID: \"f9cf561c-2b43-4b52-af29-188d48b6b441\") " pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv"
Feb 18 05:52:38 crc kubenswrapper[4869]: I0218 05:52:38.264662 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-jz4sv\" (UID: \"f9cf561c-2b43-4b52-af29-188d48b6b441\") " pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv"
Feb 18 05:52:38 crc kubenswrapper[4869]: I0218 05:52:38.329577 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f9cf561c-2b43-4b52-af29-188d48b6b441-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jz4sv\" (UID: \"f9cf561c-2b43-4b52-af29-188d48b6b441\") " pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv"
Feb 18 05:52:38 crc kubenswrapper[4869]: I0218 05:52:38.329642 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f9cf561c-2b43-4b52-af29-188d48b6b441-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jz4sv\" (UID: \"f9cf561c-2b43-4b52-af29-188d48b6b441\") " pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv"
Feb 18 05:52:38 crc kubenswrapper[4869]: I0218 05:52:38.329687 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f9cf561c-2b43-4b52-af29-188d48b6b441-registry-tls\") pod \"image-registry-66df7c8f76-jz4sv\" (UID: \"f9cf561c-2b43-4b52-af29-188d48b6b441\") " pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv"
Feb 18 05:52:38 crc kubenswrapper[4869]: I0218 05:52:38.329724 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sscws\" (UniqueName: \"kubernetes.io/projected/f9cf561c-2b43-4b52-af29-188d48b6b441-kube-api-access-sscws\") pod \"image-registry-66df7c8f76-jz4sv\" (UID: \"f9cf561c-2b43-4b52-af29-188d48b6b441\") " pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv"
Feb 18 05:52:38 crc kubenswrapper[4869]: I0218 05:52:38.329801 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f9cf561c-2b43-4b52-af29-188d48b6b441-registry-certificates\") pod \"image-registry-66df7c8f76-jz4sv\" (UID: \"f9cf561c-2b43-4b52-af29-188d48b6b441\") " pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv"
Feb 18 05:52:38 crc kubenswrapper[4869]: I0218 05:52:38.329830 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f9cf561c-2b43-4b52-af29-188d48b6b441-bound-sa-token\") pod \"image-registry-66df7c8f76-jz4sv\" (UID: \"f9cf561c-2b43-4b52-af29-188d48b6b441\") " pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv"
Feb 18 05:52:38 crc kubenswrapper[4869]: I0218 05:52:38.329917 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f9cf561c-2b43-4b52-af29-188d48b6b441-trusted-ca\") pod \"image-registry-66df7c8f76-jz4sv\" (UID: \"f9cf561c-2b43-4b52-af29-188d48b6b441\") " pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv"
Feb 18 05:52:38 crc kubenswrapper[4869]: I0218 05:52:38.330542 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f9cf561c-2b43-4b52-af29-188d48b6b441-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jz4sv\" (UID: \"f9cf561c-2b43-4b52-af29-188d48b6b441\") " pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv"
Feb 18 05:52:38 crc kubenswrapper[4869]: I0218 05:52:38.331413 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f9cf561c-2b43-4b52-af29-188d48b6b441-registry-certificates\") pod \"image-registry-66df7c8f76-jz4sv\" (UID: \"f9cf561c-2b43-4b52-af29-188d48b6b441\") " pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv"
Feb 18 05:52:38 crc kubenswrapper[4869]: I0218 05:52:38.331909 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f9cf561c-2b43-4b52-af29-188d48b6b441-trusted-ca\") pod \"image-registry-66df7c8f76-jz4sv\" (UID: \"f9cf561c-2b43-4b52-af29-188d48b6b441\") " pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv"
Feb 18 05:52:38 crc kubenswrapper[4869]: I0218 05:52:38.342651 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f9cf561c-2b43-4b52-af29-188d48b6b441-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jz4sv\" (UID: \"f9cf561c-2b43-4b52-af29-188d48b6b441\") " pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv"
Feb 18 05:52:38 crc kubenswrapper[4869]: I0218 05:52:38.342654 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f9cf561c-2b43-4b52-af29-188d48b6b441-registry-tls\") pod \"image-registry-66df7c8f76-jz4sv\" (UID: \"f9cf561c-2b43-4b52-af29-188d48b6b441\") " pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv"
Feb 18 05:52:38 crc kubenswrapper[4869]: I0218 05:52:38.346892 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sscws\" (UniqueName: \"kubernetes.io/projected/f9cf561c-2b43-4b52-af29-188d48b6b441-kube-api-access-sscws\") pod \"image-registry-66df7c8f76-jz4sv\" (UID: \"f9cf561c-2b43-4b52-af29-188d48b6b441\") " pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv"
Feb 18 05:52:38 crc kubenswrapper[4869]: I0218 05:52:38.350970 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f9cf561c-2b43-4b52-af29-188d48b6b441-bound-sa-token\") pod \"image-registry-66df7c8f76-jz4sv\" (UID: \"f9cf561c-2b43-4b52-af29-188d48b6b441\") " pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv"
Feb 18 05:52:38 crc kubenswrapper[4869]: I0218 05:52:38.649051 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv"
Feb 18 05:52:39 crc kubenswrapper[4869]: I0218 05:52:39.051854 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jz4sv"]
Feb 18 05:52:39 crc kubenswrapper[4869]: I0218 05:52:39.846835 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv" event={"ID":"f9cf561c-2b43-4b52-af29-188d48b6b441","Type":"ContainerStarted","Data":"f36e8158a07c02b4168e25946e5965c0ec40ff3db0de9652345ee114ed12d5a9"}
Feb 18 05:52:39 crc kubenswrapper[4869]: I0218 05:52:39.846888 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv" event={"ID":"f9cf561c-2b43-4b52-af29-188d48b6b441","Type":"ContainerStarted","Data":"8f0653db329250cdfe7f817743535bc7c1a3d24b59dcbb20feb700e98302240b"}
Feb 18 05:52:39 crc kubenswrapper[4869]: I0218 05:52:39.846988 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv"
Feb 18 05:52:39 crc kubenswrapper[4869]: I0218 05:52:39.868233 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv" podStartSLOduration=1.868217196 podStartE2EDuration="1.868217196s" podCreationTimestamp="2026-02-18 05:52:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:52:39.863999121 +0000 UTC m=+257.033087393" watchObservedRunningTime="2026-02-18 05:52:39.868217196 +0000 UTC m=+257.037305428"
Feb 18 05:52:51 crc kubenswrapper[4869]: I0218 05:52:51.656100 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5898c9b65b-msbl5"]
Feb 18 05:52:51 crc kubenswrapper[4869]: I0218 05:52:51.657121 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5898c9b65b-msbl5" podUID="b6aa11b4-1065-474b-8025-b96742923df7" containerName="controller-manager" containerID="cri-o://52a29e730c1447c6c267821a4cbf86393732d53b6c6fc1eaa82e93336a868308" gracePeriod=30
Feb 18 05:52:51 crc kubenswrapper[4869]: I0218 05:52:51.903767 4869 generic.go:334] "Generic (PLEG): container finished" podID="b6aa11b4-1065-474b-8025-b96742923df7" containerID="52a29e730c1447c6c267821a4cbf86393732d53b6c6fc1eaa82e93336a868308" exitCode=0
Feb 18 05:52:51 crc kubenswrapper[4869]: I0218 05:52:51.904122 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5898c9b65b-msbl5" event={"ID":"b6aa11b4-1065-474b-8025-b96742923df7","Type":"ContainerDied","Data":"52a29e730c1447c6c267821a4cbf86393732d53b6c6fc1eaa82e93336a868308"}
Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.040178 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5898c9b65b-msbl5"
Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.124476 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6aa11b4-1065-474b-8025-b96742923df7-serving-cert\") pod \"b6aa11b4-1065-474b-8025-b96742923df7\" (UID: \"b6aa11b4-1065-474b-8025-b96742923df7\") "
Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.124543 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xctgf\" (UniqueName: \"kubernetes.io/projected/b6aa11b4-1065-474b-8025-b96742923df7-kube-api-access-xctgf\") pod \"b6aa11b4-1065-474b-8025-b96742923df7\" (UID: \"b6aa11b4-1065-474b-8025-b96742923df7\") "
Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.124568 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6aa11b4-1065-474b-8025-b96742923df7-proxy-ca-bundles\") pod \"b6aa11b4-1065-474b-8025-b96742923df7\" (UID: \"b6aa11b4-1065-474b-8025-b96742923df7\") "
Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.124599 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6aa11b4-1065-474b-8025-b96742923df7-config\") pod \"b6aa11b4-1065-474b-8025-b96742923df7\" (UID: \"b6aa11b4-1065-474b-8025-b96742923df7\") "
Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.124614 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6aa11b4-1065-474b-8025-b96742923df7-client-ca\") pod \"b6aa11b4-1065-474b-8025-b96742923df7\" (UID: \"b6aa11b4-1065-474b-8025-b96742923df7\") "
Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.125700 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6aa11b4-1065-474b-8025-b96742923df7-client-ca" (OuterVolumeSpecName: "client-ca") pod "b6aa11b4-1065-474b-8025-b96742923df7" (UID: "b6aa11b4-1065-474b-8025-b96742923df7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.125820 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6aa11b4-1065-474b-8025-b96742923df7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b6aa11b4-1065-474b-8025-b96742923df7" (UID: "b6aa11b4-1065-474b-8025-b96742923df7"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.125929 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6aa11b4-1065-474b-8025-b96742923df7-config" (OuterVolumeSpecName: "config") pod "b6aa11b4-1065-474b-8025-b96742923df7" (UID: "b6aa11b4-1065-474b-8025-b96742923df7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.130854 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6aa11b4-1065-474b-8025-b96742923df7-kube-api-access-xctgf" (OuterVolumeSpecName: "kube-api-access-xctgf") pod "b6aa11b4-1065-474b-8025-b96742923df7" (UID: "b6aa11b4-1065-474b-8025-b96742923df7"). InnerVolumeSpecName "kube-api-access-xctgf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.131425 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6aa11b4-1065-474b-8025-b96742923df7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b6aa11b4-1065-474b-8025-b96742923df7" (UID: "b6aa11b4-1065-474b-8025-b96742923df7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.225624 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6aa11b4-1065-474b-8025-b96742923df7-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.225909 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xctgf\" (UniqueName: \"kubernetes.io/projected/b6aa11b4-1065-474b-8025-b96742923df7-kube-api-access-xctgf\") on node \"crc\" DevicePath \"\""
Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.225992 4869 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6aa11b4-1065-474b-8025-b96742923df7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.226055 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6aa11b4-1065-474b-8025-b96742923df7-config\") on node \"crc\" DevicePath \"\""
Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.226115 4869 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6aa11b4-1065-474b-8025-b96742923df7-client-ca\") on node \"crc\" DevicePath \"\""
Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.909989 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5898c9b65b-msbl5" event={"ID":"b6aa11b4-1065-474b-8025-b96742923df7","Type":"ContainerDied","Data":"6980d875855d250ccc089eda87d29ed74ff23ca9ad32a15f482d0240165d0833"}
Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.910054 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5898c9b65b-msbl5"
Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.910350 4869 scope.go:117] "RemoveContainer" containerID="52a29e730c1447c6c267821a4cbf86393732d53b6c6fc1eaa82e93336a868308"
Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.919562 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7bcc84d69c-cnxlk"]
Feb 18 05:52:52 crc kubenswrapper[4869]: E0218 05:52:52.919778 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6aa11b4-1065-474b-8025-b96742923df7" containerName="controller-manager"
Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.919790 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6aa11b4-1065-474b-8025-b96742923df7" containerName="controller-manager"
Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.919893 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6aa11b4-1065-474b-8025-b96742923df7" containerName="controller-manager"
Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.920209 4869 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-controller-manager/controller-manager-7bcc84d69c-cnxlk" Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.922350 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.922426 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.924963 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.928557 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.928722 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.930504 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.932015 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.934034 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bcc84d69c-cnxlk"] Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.963001 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5898c9b65b-msbl5"] Feb 18 05:52:52 crc kubenswrapper[4869]: I0218 05:52:52.967323 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5898c9b65b-msbl5"] Feb 18 05:52:53 crc 
kubenswrapper[4869]: I0218 05:52:53.038366 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c27dc573-cd91-4546-a5a9-bdad7d4cb101-client-ca\") pod \"controller-manager-7bcc84d69c-cnxlk\" (UID: \"c27dc573-cd91-4546-a5a9-bdad7d4cb101\") " pod="openshift-controller-manager/controller-manager-7bcc84d69c-cnxlk" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.038637 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w52tc\" (UniqueName: \"kubernetes.io/projected/c27dc573-cd91-4546-a5a9-bdad7d4cb101-kube-api-access-w52tc\") pod \"controller-manager-7bcc84d69c-cnxlk\" (UID: \"c27dc573-cd91-4546-a5a9-bdad7d4cb101\") " pod="openshift-controller-manager/controller-manager-7bcc84d69c-cnxlk" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.038827 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c27dc573-cd91-4546-a5a9-bdad7d4cb101-serving-cert\") pod \"controller-manager-7bcc84d69c-cnxlk\" (UID: \"c27dc573-cd91-4546-a5a9-bdad7d4cb101\") " pod="openshift-controller-manager/controller-manager-7bcc84d69c-cnxlk" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.038921 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c27dc573-cd91-4546-a5a9-bdad7d4cb101-proxy-ca-bundles\") pod \"controller-manager-7bcc84d69c-cnxlk\" (UID: \"c27dc573-cd91-4546-a5a9-bdad7d4cb101\") " pod="openshift-controller-manager/controller-manager-7bcc84d69c-cnxlk" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.039011 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c27dc573-cd91-4546-a5a9-bdad7d4cb101-config\") pod \"controller-manager-7bcc84d69c-cnxlk\" (UID: \"c27dc573-cd91-4546-a5a9-bdad7d4cb101\") " pod="openshift-controller-manager/controller-manager-7bcc84d69c-cnxlk" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.140521 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w52tc\" (UniqueName: \"kubernetes.io/projected/c27dc573-cd91-4546-a5a9-bdad7d4cb101-kube-api-access-w52tc\") pod \"controller-manager-7bcc84d69c-cnxlk\" (UID: \"c27dc573-cd91-4546-a5a9-bdad7d4cb101\") " pod="openshift-controller-manager/controller-manager-7bcc84d69c-cnxlk" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.140597 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c27dc573-cd91-4546-a5a9-bdad7d4cb101-serving-cert\") pod \"controller-manager-7bcc84d69c-cnxlk\" (UID: \"c27dc573-cd91-4546-a5a9-bdad7d4cb101\") " pod="openshift-controller-manager/controller-manager-7bcc84d69c-cnxlk" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.140616 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c27dc573-cd91-4546-a5a9-bdad7d4cb101-proxy-ca-bundles\") pod \"controller-manager-7bcc84d69c-cnxlk\" (UID: \"c27dc573-cd91-4546-a5a9-bdad7d4cb101\") " pod="openshift-controller-manager/controller-manager-7bcc84d69c-cnxlk" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.140636 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c27dc573-cd91-4546-a5a9-bdad7d4cb101-config\") pod \"controller-manager-7bcc84d69c-cnxlk\" (UID: \"c27dc573-cd91-4546-a5a9-bdad7d4cb101\") " pod="openshift-controller-manager/controller-manager-7bcc84d69c-cnxlk" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.140657 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c27dc573-cd91-4546-a5a9-bdad7d4cb101-client-ca\") pod \"controller-manager-7bcc84d69c-cnxlk\" (UID: \"c27dc573-cd91-4546-a5a9-bdad7d4cb101\") " pod="openshift-controller-manager/controller-manager-7bcc84d69c-cnxlk" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.142269 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c27dc573-cd91-4546-a5a9-bdad7d4cb101-client-ca\") pod \"controller-manager-7bcc84d69c-cnxlk\" (UID: \"c27dc573-cd91-4546-a5a9-bdad7d4cb101\") " pod="openshift-controller-manager/controller-manager-7bcc84d69c-cnxlk" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.142378 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c27dc573-cd91-4546-a5a9-bdad7d4cb101-proxy-ca-bundles\") pod \"controller-manager-7bcc84d69c-cnxlk\" (UID: \"c27dc573-cd91-4546-a5a9-bdad7d4cb101\") " pod="openshift-controller-manager/controller-manager-7bcc84d69c-cnxlk" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.142696 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c27dc573-cd91-4546-a5a9-bdad7d4cb101-config\") pod \"controller-manager-7bcc84d69c-cnxlk\" (UID: \"c27dc573-cd91-4546-a5a9-bdad7d4cb101\") " pod="openshift-controller-manager/controller-manager-7bcc84d69c-cnxlk" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.147439 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c27dc573-cd91-4546-a5a9-bdad7d4cb101-serving-cert\") pod \"controller-manager-7bcc84d69c-cnxlk\" (UID: \"c27dc573-cd91-4546-a5a9-bdad7d4cb101\") " pod="openshift-controller-manager/controller-manager-7bcc84d69c-cnxlk" Feb 18 05:52:53 crc 
kubenswrapper[4869]: I0218 05:52:53.156054 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w52tc\" (UniqueName: \"kubernetes.io/projected/c27dc573-cd91-4546-a5a9-bdad7d4cb101-kube-api-access-w52tc\") pod \"controller-manager-7bcc84d69c-cnxlk\" (UID: \"c27dc573-cd91-4546-a5a9-bdad7d4cb101\") " pod="openshift-controller-manager/controller-manager-7bcc84d69c-cnxlk" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.251935 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bcc84d69c-cnxlk" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.417169 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bcc84d69c-cnxlk"] Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.477946 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6aa11b4-1065-474b-8025-b96742923df7" path="/var/lib/kubelet/pods/b6aa11b4-1065-474b-8025-b96742923df7/volumes" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.566862 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lfkdw"] Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.567082 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lfkdw" podUID="f16542c9-445d-4f1a-883d-0a5306a6e0da" containerName="registry-server" containerID="cri-o://a53e6b34a781e705ae995355742cd7f11c11f3a95ce858c9c81d93f338478a22" gracePeriod=30 Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.577183 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2q2rk"] Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.578166 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2q2rk" 
podUID="1fa8c341-03cc-49d4-8793-88cec4a8444d" containerName="registry-server" containerID="cri-o://24967964512e784c819a2c86b1cc46cded4a3ca53a682d534ae470b55ec500fa" gracePeriod=30 Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.586245 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zd46x"] Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.586466 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-zd46x" podUID="a5c8a86f-e3a2-4088-9839-386b9dc56d03" containerName="marketplace-operator" containerID="cri-o://c2309e339033914a8062d10b3a5234bd482515f15c25d7d37e52e6e46ddb18d0" gracePeriod=30 Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.597706 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gv5qc"] Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.597975 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gv5qc" podUID="742847e6-6cb2-458e-8a75-2a76a970c4a4" containerName="registry-server" containerID="cri-o://2399554798dd2d9194f369a5e7ae93b0538568ed170a25b7995287ab0b0874ec" gracePeriod=30 Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.617793 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-97mgq"] Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.618117 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-97mgq" podUID="ac953f18-4fbf-455f-b229-a51977890aa6" containerName="registry-server" containerID="cri-o://41884e1e8621c03377bc5b38aa152e12413bd778c0fe44f1f85aad5a28f65a51" gracePeriod=30 Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.618784 4869 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-997j2"] Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.619544 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-997j2" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.636030 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-997j2"] Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.747195 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/89d643e7-cad7-4856-9d82-c0370e1f20e5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-997j2\" (UID: \"89d643e7-cad7-4856-9d82-c0370e1f20e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-997j2" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.747244 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/89d643e7-cad7-4856-9d82-c0370e1f20e5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-997j2\" (UID: \"89d643e7-cad7-4856-9d82-c0370e1f20e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-997j2" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.747343 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4m7s\" (UniqueName: \"kubernetes.io/projected/89d643e7-cad7-4856-9d82-c0370e1f20e5-kube-api-access-q4m7s\") pod \"marketplace-operator-79b997595-997j2\" (UID: \"89d643e7-cad7-4856-9d82-c0370e1f20e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-997j2" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.848644 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4m7s\" 
(UniqueName: \"kubernetes.io/projected/89d643e7-cad7-4856-9d82-c0370e1f20e5-kube-api-access-q4m7s\") pod \"marketplace-operator-79b997595-997j2\" (UID: \"89d643e7-cad7-4856-9d82-c0370e1f20e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-997j2" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.848721 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/89d643e7-cad7-4856-9d82-c0370e1f20e5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-997j2\" (UID: \"89d643e7-cad7-4856-9d82-c0370e1f20e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-997j2" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.848761 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/89d643e7-cad7-4856-9d82-c0370e1f20e5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-997j2\" (UID: \"89d643e7-cad7-4856-9d82-c0370e1f20e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-997j2" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.851260 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/89d643e7-cad7-4856-9d82-c0370e1f20e5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-997j2\" (UID: \"89d643e7-cad7-4856-9d82-c0370e1f20e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-997j2" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.854981 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/89d643e7-cad7-4856-9d82-c0370e1f20e5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-997j2\" (UID: \"89d643e7-cad7-4856-9d82-c0370e1f20e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-997j2" Feb 
18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.876443 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4m7s\" (UniqueName: \"kubernetes.io/projected/89d643e7-cad7-4856-9d82-c0370e1f20e5-kube-api-access-q4m7s\") pod \"marketplace-operator-79b997595-997j2\" (UID: \"89d643e7-cad7-4856-9d82-c0370e1f20e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-997j2" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.877885 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-997j2" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.925170 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bcc84d69c-cnxlk" event={"ID":"c27dc573-cd91-4546-a5a9-bdad7d4cb101","Type":"ContainerStarted","Data":"fc2a4d87364bd17a9a9d765e5eef7a33ccc6e5871392ff780ee0d60889be5b0e"} Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.925223 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bcc84d69c-cnxlk" event={"ID":"c27dc573-cd91-4546-a5a9-bdad7d4cb101","Type":"ContainerStarted","Data":"3175e5ec71c93f99696673654ec1b1a9f1ced49b6cec0b453e621e37a2fb2c8e"} Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.925819 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7bcc84d69c-cnxlk" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.935550 4869 generic.go:334] "Generic (PLEG): container finished" podID="742847e6-6cb2-458e-8a75-2a76a970c4a4" containerID="2399554798dd2d9194f369a5e7ae93b0538568ed170a25b7995287ab0b0874ec" exitCode=0 Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.935631 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gv5qc" 
event={"ID":"742847e6-6cb2-458e-8a75-2a76a970c4a4","Type":"ContainerDied","Data":"2399554798dd2d9194f369a5e7ae93b0538568ed170a25b7995287ab0b0874ec"} Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.947038 4869 generic.go:334] "Generic (PLEG): container finished" podID="1fa8c341-03cc-49d4-8793-88cec4a8444d" containerID="24967964512e784c819a2c86b1cc46cded4a3ca53a682d534ae470b55ec500fa" exitCode=0 Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.947158 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2q2rk" event={"ID":"1fa8c341-03cc-49d4-8793-88cec4a8444d","Type":"ContainerDied","Data":"24967964512e784c819a2c86b1cc46cded4a3ca53a682d534ae470b55ec500fa"} Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.965709 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7bcc84d69c-cnxlk" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.976058 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7bcc84d69c-cnxlk" podStartSLOduration=2.9760326089999998 podStartE2EDuration="2.976032609s" podCreationTimestamp="2026-02-18 05:52:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:52:53.952958282 +0000 UTC m=+271.122046524" watchObservedRunningTime="2026-02-18 05:52:53.976032609 +0000 UTC m=+271.145120841" Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.978760 4869 generic.go:334] "Generic (PLEG): container finished" podID="ac953f18-4fbf-455f-b229-a51977890aa6" containerID="41884e1e8621c03377bc5b38aa152e12413bd778c0fe44f1f85aad5a28f65a51" exitCode=0 Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.978823 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97mgq" 
event={"ID":"ac953f18-4fbf-455f-b229-a51977890aa6","Type":"ContainerDied","Data":"41884e1e8621c03377bc5b38aa152e12413bd778c0fe44f1f85aad5a28f65a51"} Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.987954 4869 generic.go:334] "Generic (PLEG): container finished" podID="f16542c9-445d-4f1a-883d-0a5306a6e0da" containerID="a53e6b34a781e705ae995355742cd7f11c11f3a95ce858c9c81d93f338478a22" exitCode=0 Feb 18 05:52:53 crc kubenswrapper[4869]: I0218 05:52:53.988022 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lfkdw" event={"ID":"f16542c9-445d-4f1a-883d-0a5306a6e0da","Type":"ContainerDied","Data":"a53e6b34a781e705ae995355742cd7f11c11f3a95ce858c9c81d93f338478a22"} Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.011034 4869 generic.go:334] "Generic (PLEG): container finished" podID="a5c8a86f-e3a2-4088-9839-386b9dc56d03" containerID="c2309e339033914a8062d10b3a5234bd482515f15c25d7d37e52e6e46ddb18d0" exitCode=0 Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.011087 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zd46x" event={"ID":"a5c8a86f-e3a2-4088-9839-386b9dc56d03","Type":"ContainerDied","Data":"c2309e339033914a8062d10b3a5234bd482515f15c25d7d37e52e6e46ddb18d0"} Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.011119 4869 scope.go:117] "RemoveContainer" containerID="95da8d9bc2115bb8279152a8d9f1d2cb19e38f7fb924557c084d8af071fe7c8f" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.018299 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lfkdw" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.114418 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gv5qc" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.133209 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2q2rk" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.159405 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k692c\" (UniqueName: \"kubernetes.io/projected/f16542c9-445d-4f1a-883d-0a5306a6e0da-kube-api-access-k692c\") pod \"f16542c9-445d-4f1a-883d-0a5306a6e0da\" (UID: \"f16542c9-445d-4f1a-883d-0a5306a6e0da\") " Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.159464 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f16542c9-445d-4f1a-883d-0a5306a6e0da-utilities\") pod \"f16542c9-445d-4f1a-883d-0a5306a6e0da\" (UID: \"f16542c9-445d-4f1a-883d-0a5306a6e0da\") " Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.159500 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f16542c9-445d-4f1a-883d-0a5306a6e0da-catalog-content\") pod \"f16542c9-445d-4f1a-883d-0a5306a6e0da\" (UID: \"f16542c9-445d-4f1a-883d-0a5306a6e0da\") " Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.164206 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f16542c9-445d-4f1a-883d-0a5306a6e0da-utilities" (OuterVolumeSpecName: "utilities") pod "f16542c9-445d-4f1a-883d-0a5306a6e0da" (UID: "f16542c9-445d-4f1a-883d-0a5306a6e0da"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.167669 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f16542c9-445d-4f1a-883d-0a5306a6e0da-kube-api-access-k692c" (OuterVolumeSpecName: "kube-api-access-k692c") pod "f16542c9-445d-4f1a-883d-0a5306a6e0da" (UID: "f16542c9-445d-4f1a-883d-0a5306a6e0da"). InnerVolumeSpecName "kube-api-access-k692c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.227598 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-97mgq" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.229927 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zd46x" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.236180 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f16542c9-445d-4f1a-883d-0a5306a6e0da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f16542c9-445d-4f1a-883d-0a5306a6e0da" (UID: "f16542c9-445d-4f1a-883d-0a5306a6e0da"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.260908 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhl9s\" (UniqueName: \"kubernetes.io/projected/742847e6-6cb2-458e-8a75-2a76a970c4a4-kube-api-access-mhl9s\") pod \"742847e6-6cb2-458e-8a75-2a76a970c4a4\" (UID: \"742847e6-6cb2-458e-8a75-2a76a970c4a4\") " Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.260994 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fa8c341-03cc-49d4-8793-88cec4a8444d-utilities\") pod \"1fa8c341-03cc-49d4-8793-88cec4a8444d\" (UID: \"1fa8c341-03cc-49d4-8793-88cec4a8444d\") " Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.261064 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fa8c341-03cc-49d4-8793-88cec4a8444d-catalog-content\") pod \"1fa8c341-03cc-49d4-8793-88cec4a8444d\" (UID: \"1fa8c341-03cc-49d4-8793-88cec4a8444d\") " Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.261094 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/742847e6-6cb2-458e-8a75-2a76a970c4a4-utilities\") pod \"742847e6-6cb2-458e-8a75-2a76a970c4a4\" (UID: \"742847e6-6cb2-458e-8a75-2a76a970c4a4\") " Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.261118 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/742847e6-6cb2-458e-8a75-2a76a970c4a4-catalog-content\") pod \"742847e6-6cb2-458e-8a75-2a76a970c4a4\" (UID: \"742847e6-6cb2-458e-8a75-2a76a970c4a4\") " Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.261199 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbfq9\" 
(UniqueName: \"kubernetes.io/projected/1fa8c341-03cc-49d4-8793-88cec4a8444d-kube-api-access-lbfq9\") pod \"1fa8c341-03cc-49d4-8793-88cec4a8444d\" (UID: \"1fa8c341-03cc-49d4-8793-88cec4a8444d\") " Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.261476 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k692c\" (UniqueName: \"kubernetes.io/projected/f16542c9-445d-4f1a-883d-0a5306a6e0da-kube-api-access-k692c\") on node \"crc\" DevicePath \"\"" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.261493 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f16542c9-445d-4f1a-883d-0a5306a6e0da-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.261504 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f16542c9-445d-4f1a-883d-0a5306a6e0da-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.262605 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/742847e6-6cb2-458e-8a75-2a76a970c4a4-utilities" (OuterVolumeSpecName: "utilities") pod "742847e6-6cb2-458e-8a75-2a76a970c4a4" (UID: "742847e6-6cb2-458e-8a75-2a76a970c4a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.262885 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fa8c341-03cc-49d4-8793-88cec4a8444d-utilities" (OuterVolumeSpecName: "utilities") pod "1fa8c341-03cc-49d4-8793-88cec4a8444d" (UID: "1fa8c341-03cc-49d4-8793-88cec4a8444d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.265922 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fa8c341-03cc-49d4-8793-88cec4a8444d-kube-api-access-lbfq9" (OuterVolumeSpecName: "kube-api-access-lbfq9") pod "1fa8c341-03cc-49d4-8793-88cec4a8444d" (UID: "1fa8c341-03cc-49d4-8793-88cec4a8444d"). InnerVolumeSpecName "kube-api-access-lbfq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.275003 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/742847e6-6cb2-458e-8a75-2a76a970c4a4-kube-api-access-mhl9s" (OuterVolumeSpecName: "kube-api-access-mhl9s") pod "742847e6-6cb2-458e-8a75-2a76a970c4a4" (UID: "742847e6-6cb2-458e-8a75-2a76a970c4a4"). InnerVolumeSpecName "kube-api-access-mhl9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.316152 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/742847e6-6cb2-458e-8a75-2a76a970c4a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "742847e6-6cb2-458e-8a75-2a76a970c4a4" (UID: "742847e6-6cb2-458e-8a75-2a76a970c4a4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.317364 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fa8c341-03cc-49d4-8793-88cec4a8444d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1fa8c341-03cc-49d4-8793-88cec4a8444d" (UID: "1fa8c341-03cc-49d4-8793-88cec4a8444d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.362765 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxrgg\" (UniqueName: \"kubernetes.io/projected/ac953f18-4fbf-455f-b229-a51977890aa6-kube-api-access-hxrgg\") pod \"ac953f18-4fbf-455f-b229-a51977890aa6\" (UID: \"ac953f18-4fbf-455f-b229-a51977890aa6\") " Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.362809 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tjpw\" (UniqueName: \"kubernetes.io/projected/a5c8a86f-e3a2-4088-9839-386b9dc56d03-kube-api-access-4tjpw\") pod \"a5c8a86f-e3a2-4088-9839-386b9dc56d03\" (UID: \"a5c8a86f-e3a2-4088-9839-386b9dc56d03\") " Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.362835 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac953f18-4fbf-455f-b229-a51977890aa6-catalog-content\") pod \"ac953f18-4fbf-455f-b229-a51977890aa6\" (UID: \"ac953f18-4fbf-455f-b229-a51977890aa6\") " Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.362912 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5c8a86f-e3a2-4088-9839-386b9dc56d03-marketplace-trusted-ca\") pod \"a5c8a86f-e3a2-4088-9839-386b9dc56d03\" (UID: \"a5c8a86f-e3a2-4088-9839-386b9dc56d03\") " Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.362964 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a5c8a86f-e3a2-4088-9839-386b9dc56d03-marketplace-operator-metrics\") pod \"a5c8a86f-e3a2-4088-9839-386b9dc56d03\" (UID: \"a5c8a86f-e3a2-4088-9839-386b9dc56d03\") " Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.362990 4869 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac953f18-4fbf-455f-b229-a51977890aa6-utilities\") pod \"ac953f18-4fbf-455f-b229-a51977890aa6\" (UID: \"ac953f18-4fbf-455f-b229-a51977890aa6\") " Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.363238 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbfq9\" (UniqueName: \"kubernetes.io/projected/1fa8c341-03cc-49d4-8793-88cec4a8444d-kube-api-access-lbfq9\") on node \"crc\" DevicePath \"\"" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.363261 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhl9s\" (UniqueName: \"kubernetes.io/projected/742847e6-6cb2-458e-8a75-2a76a970c4a4-kube-api-access-mhl9s\") on node \"crc\" DevicePath \"\"" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.363273 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fa8c341-03cc-49d4-8793-88cec4a8444d-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.363284 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fa8c341-03cc-49d4-8793-88cec4a8444d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.363300 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/742847e6-6cb2-458e-8a75-2a76a970c4a4-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.363310 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/742847e6-6cb2-458e-8a75-2a76a970c4a4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.363665 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/configmap/a5c8a86f-e3a2-4088-9839-386b9dc56d03-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "a5c8a86f-e3a2-4088-9839-386b9dc56d03" (UID: "a5c8a86f-e3a2-4088-9839-386b9dc56d03"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.364045 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac953f18-4fbf-455f-b229-a51977890aa6-utilities" (OuterVolumeSpecName: "utilities") pod "ac953f18-4fbf-455f-b229-a51977890aa6" (UID: "ac953f18-4fbf-455f-b229-a51977890aa6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.365677 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5c8a86f-e3a2-4088-9839-386b9dc56d03-kube-api-access-4tjpw" (OuterVolumeSpecName: "kube-api-access-4tjpw") pod "a5c8a86f-e3a2-4088-9839-386b9dc56d03" (UID: "a5c8a86f-e3a2-4088-9839-386b9dc56d03"). InnerVolumeSpecName "kube-api-access-4tjpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.367222 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5c8a86f-e3a2-4088-9839-386b9dc56d03-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "a5c8a86f-e3a2-4088-9839-386b9dc56d03" (UID: "a5c8a86f-e3a2-4088-9839-386b9dc56d03"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.367931 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac953f18-4fbf-455f-b229-a51977890aa6-kube-api-access-hxrgg" (OuterVolumeSpecName: "kube-api-access-hxrgg") pod "ac953f18-4fbf-455f-b229-a51977890aa6" (UID: "ac953f18-4fbf-455f-b229-a51977890aa6"). InnerVolumeSpecName "kube-api-access-hxrgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.433130 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-997j2"] Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.464357 4869 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a5c8a86f-e3a2-4088-9839-386b9dc56d03-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.464398 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac953f18-4fbf-455f-b229-a51977890aa6-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.464413 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxrgg\" (UniqueName: \"kubernetes.io/projected/ac953f18-4fbf-455f-b229-a51977890aa6-kube-api-access-hxrgg\") on node \"crc\" DevicePath \"\"" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.464426 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tjpw\" (UniqueName: \"kubernetes.io/projected/a5c8a86f-e3a2-4088-9839-386b9dc56d03-kube-api-access-4tjpw\") on node \"crc\" DevicePath \"\"" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.464438 4869 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a5c8a86f-e3a2-4088-9839-386b9dc56d03-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.488830 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac953f18-4fbf-455f-b229-a51977890aa6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac953f18-4fbf-455f-b229-a51977890aa6" (UID: "ac953f18-4fbf-455f-b229-a51977890aa6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:52:54 crc kubenswrapper[4869]: I0218 05:52:54.566920 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac953f18-4fbf-455f-b229-a51977890aa6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.020088 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2q2rk" event={"ID":"1fa8c341-03cc-49d4-8793-88cec4a8444d","Type":"ContainerDied","Data":"894f49cc3a135fcbce848afbd9f95fa99017a4f61510c627b93ffa162abc422f"} Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.020524 4869 scope.go:117] "RemoveContainer" containerID="24967964512e784c819a2c86b1cc46cded4a3ca53a682d534ae470b55ec500fa" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.020100 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2q2rk" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.023002 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97mgq" event={"ID":"ac953f18-4fbf-455f-b229-a51977890aa6","Type":"ContainerDied","Data":"a9447bd55b8ea888f696ae29cb06c13df15bd719b255085f7c6854d5f8fc995a"} Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.023041 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-97mgq" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.025519 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lfkdw" event={"ID":"f16542c9-445d-4f1a-883d-0a5306a6e0da","Type":"ContainerDied","Data":"d082db128eb59e3e029736eed6babda3794910c03f08794ec85ed791ea932299"} Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.025588 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lfkdw" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.029859 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zd46x" event={"ID":"a5c8a86f-e3a2-4088-9839-386b9dc56d03","Type":"ContainerDied","Data":"62fa0171431dc588dc09904620132917d479adfcbe5e5987bc671504556b7cb0"} Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.029860 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zd46x" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.032460 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gv5qc" event={"ID":"742847e6-6cb2-458e-8a75-2a76a970c4a4","Type":"ContainerDied","Data":"bdea75f28fab35f4c5ae8afec9b13e330fbdae6b37b02951b8148ead82354d94"} Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.032559 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gv5qc" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.038099 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-997j2" event={"ID":"89d643e7-cad7-4856-9d82-c0370e1f20e5","Type":"ContainerStarted","Data":"f66f1ff8546752a0e8f79f9c2737aa74aa25aa1bfff00121c6c3047c54e5cdc1"} Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.038134 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-997j2" event={"ID":"89d643e7-cad7-4856-9d82-c0370e1f20e5","Type":"ContainerStarted","Data":"ce6f4fd5f595b5a1b404b0605b6b3e74213f74e734e2460e7667d07901ad8f1a"} Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.039025 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-997j2" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.044139 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-997j2" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.054178 4869 scope.go:117] "RemoveContainer" containerID="f79414943cc9d27aa0dc7c1953eb659343b2b7594344e830b0fb579fde25488e" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.088664 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2q2rk"] Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.103735 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2q2rk"] Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.104079 4869 scope.go:117] "RemoveContainer" containerID="302ff42c06139a942207345263eb7eb7bf5f94c64767e7314442cc6715e7062c" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.109436 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/marketplace-operator-79b997595-997j2" podStartSLOduration=2.109412083 podStartE2EDuration="2.109412083s" podCreationTimestamp="2026-02-18 05:52:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:52:55.102433229 +0000 UTC m=+272.271521461" watchObservedRunningTime="2026-02-18 05:52:55.109412083 +0000 UTC m=+272.278500335" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.131420 4869 scope.go:117] "RemoveContainer" containerID="41884e1e8621c03377bc5b38aa152e12413bd778c0fe44f1f85aad5a28f65a51" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.137217 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-97mgq"] Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.152730 4869 scope.go:117] "RemoveContainer" containerID="b04a039a266aa65388efee3e719f3d075106eec8f2ed018a73a292ce9d4b6d23" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.154226 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-97mgq"] Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.167992 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gv5qc"] Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.173249 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gv5qc"] Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.202142 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lfkdw"] Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.202810 4869 scope.go:117] "RemoveContainer" containerID="c22247b6183532b3afbb76c0bc237896949d16653ffa4bb10af14ddce883118e" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.207138 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-lfkdw"] Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.214981 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zd46x"] Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.216937 4869 scope.go:117] "RemoveContainer" containerID="a53e6b34a781e705ae995355742cd7f11c11f3a95ce858c9c81d93f338478a22" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.221908 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zd46x"] Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.247294 4869 scope.go:117] "RemoveContainer" containerID="dad91b736972020acf87743d6ce207a6766cc797e2f7ef569401108ddbc8d742" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.263162 4869 scope.go:117] "RemoveContainer" containerID="efe9e14c1822d9fedf3185cd7ce4ff3e8a94bf07d16835e5684c013d58efa762" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.287948 4869 scope.go:117] "RemoveContainer" containerID="c2309e339033914a8062d10b3a5234bd482515f15c25d7d37e52e6e46ddb18d0" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.311314 4869 scope.go:117] "RemoveContainer" containerID="2399554798dd2d9194f369a5e7ae93b0538568ed170a25b7995287ab0b0874ec" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.323067 4869 scope.go:117] "RemoveContainer" containerID="9ed821e90ff3031a5e28548b5141d3a65994c73c6dc08852c06fb15628f1cbcb" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.338670 4869 scope.go:117] "RemoveContainer" containerID="9177c53668eb082dd00b3173b50e13223f1fec106e37d65b051ef8a45f74cbcb" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.477301 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fa8c341-03cc-49d4-8793-88cec4a8444d" path="/var/lib/kubelet/pods/1fa8c341-03cc-49d4-8793-88cec4a8444d/volumes" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.478065 4869 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="742847e6-6cb2-458e-8a75-2a76a970c4a4" path="/var/lib/kubelet/pods/742847e6-6cb2-458e-8a75-2a76a970c4a4/volumes" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.478879 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5c8a86f-e3a2-4088-9839-386b9dc56d03" path="/var/lib/kubelet/pods/a5c8a86f-e3a2-4088-9839-386b9dc56d03/volumes" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.479902 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac953f18-4fbf-455f-b229-a51977890aa6" path="/var/lib/kubelet/pods/ac953f18-4fbf-455f-b229-a51977890aa6/volumes" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.480733 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f16542c9-445d-4f1a-883d-0a5306a6e0da" path="/var/lib/kubelet/pods/f16542c9-445d-4f1a-883d-0a5306a6e0da/volumes" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.785539 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rnzzv"] Feb 18 05:52:55 crc kubenswrapper[4869]: E0218 05:52:55.785734 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa8c341-03cc-49d4-8793-88cec4a8444d" containerName="extract-utilities" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.785763 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa8c341-03cc-49d4-8793-88cec4a8444d" containerName="extract-utilities" Feb 18 05:52:55 crc kubenswrapper[4869]: E0218 05:52:55.785775 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac953f18-4fbf-455f-b229-a51977890aa6" containerName="extract-content" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.785780 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac953f18-4fbf-455f-b229-a51977890aa6" containerName="extract-content" Feb 18 05:52:55 crc kubenswrapper[4869]: E0218 05:52:55.785790 4869 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ac953f18-4fbf-455f-b229-a51977890aa6" containerName="extract-utilities" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.785796 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac953f18-4fbf-455f-b229-a51977890aa6" containerName="extract-utilities" Feb 18 05:52:55 crc kubenswrapper[4869]: E0218 05:52:55.785806 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="742847e6-6cb2-458e-8a75-2a76a970c4a4" containerName="registry-server" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.785815 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="742847e6-6cb2-458e-8a75-2a76a970c4a4" containerName="registry-server" Feb 18 05:52:55 crc kubenswrapper[4869]: E0218 05:52:55.785825 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5c8a86f-e3a2-4088-9839-386b9dc56d03" containerName="marketplace-operator" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.785832 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5c8a86f-e3a2-4088-9839-386b9dc56d03" containerName="marketplace-operator" Feb 18 05:52:55 crc kubenswrapper[4869]: E0218 05:52:55.785842 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="742847e6-6cb2-458e-8a75-2a76a970c4a4" containerName="extract-content" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.785849 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="742847e6-6cb2-458e-8a75-2a76a970c4a4" containerName="extract-content" Feb 18 05:52:55 crc kubenswrapper[4869]: E0218 05:52:55.785858 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa8c341-03cc-49d4-8793-88cec4a8444d" containerName="extract-content" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.785866 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa8c341-03cc-49d4-8793-88cec4a8444d" containerName="extract-content" Feb 18 05:52:55 crc kubenswrapper[4869]: E0218 05:52:55.785874 4869 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ac953f18-4fbf-455f-b229-a51977890aa6" containerName="registry-server" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.785880 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac953f18-4fbf-455f-b229-a51977890aa6" containerName="registry-server" Feb 18 05:52:55 crc kubenswrapper[4869]: E0218 05:52:55.785887 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa8c341-03cc-49d4-8793-88cec4a8444d" containerName="registry-server" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.785893 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa8c341-03cc-49d4-8793-88cec4a8444d" containerName="registry-server" Feb 18 05:52:55 crc kubenswrapper[4869]: E0218 05:52:55.785902 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5c8a86f-e3a2-4088-9839-386b9dc56d03" containerName="marketplace-operator" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.785908 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5c8a86f-e3a2-4088-9839-386b9dc56d03" containerName="marketplace-operator" Feb 18 05:52:55 crc kubenswrapper[4869]: E0218 05:52:55.785919 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="742847e6-6cb2-458e-8a75-2a76a970c4a4" containerName="extract-utilities" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.785926 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="742847e6-6cb2-458e-8a75-2a76a970c4a4" containerName="extract-utilities" Feb 18 05:52:55 crc kubenswrapper[4869]: E0218 05:52:55.785936 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f16542c9-445d-4f1a-883d-0a5306a6e0da" containerName="extract-content" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.785943 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f16542c9-445d-4f1a-883d-0a5306a6e0da" containerName="extract-content" Feb 18 05:52:55 crc kubenswrapper[4869]: E0218 05:52:55.785951 4869 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f16542c9-445d-4f1a-883d-0a5306a6e0da" containerName="registry-server" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.785958 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f16542c9-445d-4f1a-883d-0a5306a6e0da" containerName="registry-server" Feb 18 05:52:55 crc kubenswrapper[4869]: E0218 05:52:55.785967 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f16542c9-445d-4f1a-883d-0a5306a6e0da" containerName="extract-utilities" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.785974 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f16542c9-445d-4f1a-883d-0a5306a6e0da" containerName="extract-utilities" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.786077 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac953f18-4fbf-455f-b229-a51977890aa6" containerName="registry-server" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.786091 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f16542c9-445d-4f1a-883d-0a5306a6e0da" containerName="registry-server" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.786102 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5c8a86f-e3a2-4088-9839-386b9dc56d03" containerName="marketplace-operator" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.786115 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="742847e6-6cb2-458e-8a75-2a76a970c4a4" containerName="registry-server" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.786128 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5c8a86f-e3a2-4088-9839-386b9dc56d03" containerName="marketplace-operator" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.786137 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa8c341-03cc-49d4-8793-88cec4a8444d" containerName="registry-server" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 
05:52:55.786974 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rnzzv" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.789505 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.802275 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d26ef514-863b-48dd-8576-d68036b43bf6-catalog-content\") pod \"redhat-marketplace-rnzzv\" (UID: \"d26ef514-863b-48dd-8576-d68036b43bf6\") " pod="openshift-marketplace/redhat-marketplace-rnzzv" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.802325 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh4lb\" (UniqueName: \"kubernetes.io/projected/d26ef514-863b-48dd-8576-d68036b43bf6-kube-api-access-bh4lb\") pod \"redhat-marketplace-rnzzv\" (UID: \"d26ef514-863b-48dd-8576-d68036b43bf6\") " pod="openshift-marketplace/redhat-marketplace-rnzzv" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.802438 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d26ef514-863b-48dd-8576-d68036b43bf6-utilities\") pod \"redhat-marketplace-rnzzv\" (UID: \"d26ef514-863b-48dd-8576-d68036b43bf6\") " pod="openshift-marketplace/redhat-marketplace-rnzzv" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.802571 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rnzzv"] Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.904138 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d26ef514-863b-48dd-8576-d68036b43bf6-utilities\") 
pod \"redhat-marketplace-rnzzv\" (UID: \"d26ef514-863b-48dd-8576-d68036b43bf6\") " pod="openshift-marketplace/redhat-marketplace-rnzzv" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.904222 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d26ef514-863b-48dd-8576-d68036b43bf6-catalog-content\") pod \"redhat-marketplace-rnzzv\" (UID: \"d26ef514-863b-48dd-8576-d68036b43bf6\") " pod="openshift-marketplace/redhat-marketplace-rnzzv" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.904256 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh4lb\" (UniqueName: \"kubernetes.io/projected/d26ef514-863b-48dd-8576-d68036b43bf6-kube-api-access-bh4lb\") pod \"redhat-marketplace-rnzzv\" (UID: \"d26ef514-863b-48dd-8576-d68036b43bf6\") " pod="openshift-marketplace/redhat-marketplace-rnzzv" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.904842 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d26ef514-863b-48dd-8576-d68036b43bf6-utilities\") pod \"redhat-marketplace-rnzzv\" (UID: \"d26ef514-863b-48dd-8576-d68036b43bf6\") " pod="openshift-marketplace/redhat-marketplace-rnzzv" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.905002 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d26ef514-863b-48dd-8576-d68036b43bf6-catalog-content\") pod \"redhat-marketplace-rnzzv\" (UID: \"d26ef514-863b-48dd-8576-d68036b43bf6\") " pod="openshift-marketplace/redhat-marketplace-rnzzv" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.925331 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh4lb\" (UniqueName: \"kubernetes.io/projected/d26ef514-863b-48dd-8576-d68036b43bf6-kube-api-access-bh4lb\") pod \"redhat-marketplace-rnzzv\" 
(UID: \"d26ef514-863b-48dd-8576-d68036b43bf6\") " pod="openshift-marketplace/redhat-marketplace-rnzzv" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.980053 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d6h99"] Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.981231 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d6h99" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.989431 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 18 05:52:55 crc kubenswrapper[4869]: I0218 05:52:55.993994 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d6h99"] Feb 18 05:52:56 crc kubenswrapper[4869]: I0218 05:52:56.006078 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2m6l\" (UniqueName: \"kubernetes.io/projected/7d008101-ce8c-46ba-986d-a6c77a268c0b-kube-api-access-f2m6l\") pod \"certified-operators-d6h99\" (UID: \"7d008101-ce8c-46ba-986d-a6c77a268c0b\") " pod="openshift-marketplace/certified-operators-d6h99" Feb 18 05:52:56 crc kubenswrapper[4869]: I0218 05:52:56.006247 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d008101-ce8c-46ba-986d-a6c77a268c0b-utilities\") pod \"certified-operators-d6h99\" (UID: \"7d008101-ce8c-46ba-986d-a6c77a268c0b\") " pod="openshift-marketplace/certified-operators-d6h99" Feb 18 05:52:56 crc kubenswrapper[4869]: I0218 05:52:56.006307 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d008101-ce8c-46ba-986d-a6c77a268c0b-catalog-content\") pod \"certified-operators-d6h99\" (UID: 
\"7d008101-ce8c-46ba-986d-a6c77a268c0b\") " pod="openshift-marketplace/certified-operators-d6h99"
Feb 18 05:52:56 crc kubenswrapper[4869]: I0218 05:52:56.107960 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d008101-ce8c-46ba-986d-a6c77a268c0b-utilities\") pod \"certified-operators-d6h99\" (UID: \"7d008101-ce8c-46ba-986d-a6c77a268c0b\") " pod="openshift-marketplace/certified-operators-d6h99"
Feb 18 05:52:56 crc kubenswrapper[4869]: I0218 05:52:56.108642 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d008101-ce8c-46ba-986d-a6c77a268c0b-catalog-content\") pod \"certified-operators-d6h99\" (UID: \"7d008101-ce8c-46ba-986d-a6c77a268c0b\") " pod="openshift-marketplace/certified-operators-d6h99"
Feb 18 05:52:56 crc kubenswrapper[4869]: I0218 05:52:56.108714 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d008101-ce8c-46ba-986d-a6c77a268c0b-utilities\") pod \"certified-operators-d6h99\" (UID: \"7d008101-ce8c-46ba-986d-a6c77a268c0b\") " pod="openshift-marketplace/certified-operators-d6h99"
Feb 18 05:52:56 crc kubenswrapper[4869]: I0218 05:52:56.108915 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2m6l\" (UniqueName: \"kubernetes.io/projected/7d008101-ce8c-46ba-986d-a6c77a268c0b-kube-api-access-f2m6l\") pod \"certified-operators-d6h99\" (UID: \"7d008101-ce8c-46ba-986d-a6c77a268c0b\") " pod="openshift-marketplace/certified-operators-d6h99"
Feb 18 05:52:56 crc kubenswrapper[4869]: I0218 05:52:56.108953 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d008101-ce8c-46ba-986d-a6c77a268c0b-catalog-content\") pod \"certified-operators-d6h99\" (UID: \"7d008101-ce8c-46ba-986d-a6c77a268c0b\") " pod="openshift-marketplace/certified-operators-d6h99"
Feb 18 05:52:56 crc kubenswrapper[4869]: I0218 05:52:56.117718 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rnzzv"
Feb 18 05:52:56 crc kubenswrapper[4869]: I0218 05:52:56.126509 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2m6l\" (UniqueName: \"kubernetes.io/projected/7d008101-ce8c-46ba-986d-a6c77a268c0b-kube-api-access-f2m6l\") pod \"certified-operators-d6h99\" (UID: \"7d008101-ce8c-46ba-986d-a6c77a268c0b\") " pod="openshift-marketplace/certified-operators-d6h99"
Feb 18 05:52:56 crc kubenswrapper[4869]: I0218 05:52:56.302209 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d6h99"
Feb 18 05:52:56 crc kubenswrapper[4869]: I0218 05:52:56.519776 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rnzzv"]
Feb 18 05:52:56 crc kubenswrapper[4869]: I0218 05:52:56.711345 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d6h99"]
Feb 18 05:52:57 crc kubenswrapper[4869]: I0218 05:52:57.058726 4869 generic.go:334] "Generic (PLEG): container finished" podID="7d008101-ce8c-46ba-986d-a6c77a268c0b" containerID="3abc25f870b71961c981625b1437445afef701f517f691374517515915b16250" exitCode=0
Feb 18 05:52:57 crc kubenswrapper[4869]: I0218 05:52:57.058786 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d6h99" event={"ID":"7d008101-ce8c-46ba-986d-a6c77a268c0b","Type":"ContainerDied","Data":"3abc25f870b71961c981625b1437445afef701f517f691374517515915b16250"}
Feb 18 05:52:57 crc kubenswrapper[4869]: I0218 05:52:57.059145 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d6h99" event={"ID":"7d008101-ce8c-46ba-986d-a6c77a268c0b","Type":"ContainerStarted","Data":"8d0efa85813383802e74901f8bdc2e94b999f144b562ad3d9abbe3c50eed9aaa"}
Feb 18 05:52:57 crc kubenswrapper[4869]: I0218 05:52:57.061381 4869 generic.go:334] "Generic (PLEG): container finished" podID="d26ef514-863b-48dd-8576-d68036b43bf6" containerID="392f631cc58d7eb16d0a7493dc1065f6038a72764cac96ddc22bac3bdad212c1" exitCode=0
Feb 18 05:52:57 crc kubenswrapper[4869]: I0218 05:52:57.061934 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnzzv" event={"ID":"d26ef514-863b-48dd-8576-d68036b43bf6","Type":"ContainerDied","Data":"392f631cc58d7eb16d0a7493dc1065f6038a72764cac96ddc22bac3bdad212c1"}
Feb 18 05:52:57 crc kubenswrapper[4869]: I0218 05:52:57.061996 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnzzv" event={"ID":"d26ef514-863b-48dd-8576-d68036b43bf6","Type":"ContainerStarted","Data":"fb6ac7928f134049133beaa3776df3ce2ea7bca63fa54ce34ab14df70dc41b2a"}
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.069300 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d6h99" event={"ID":"7d008101-ce8c-46ba-986d-a6c77a268c0b","Type":"ContainerStarted","Data":"d9953efc885962ef7acf43f255f16dcb2efdcf90c063c96e2c1b61a0cb9bc50d"}
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.071007 4869 generic.go:334] "Generic (PLEG): container finished" podID="d26ef514-863b-48dd-8576-d68036b43bf6" containerID="bb629dd1fec3f79faf877ffbb788ec13d1707c1089442a2d11c7de6b3107e9b3" exitCode=0
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.071045 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnzzv" event={"ID":"d26ef514-863b-48dd-8576-d68036b43bf6","Type":"ContainerDied","Data":"bb629dd1fec3f79faf877ffbb788ec13d1707c1089442a2d11c7de6b3107e9b3"}
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.181178 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7px69"]
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.182344 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7px69"
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.184205 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.195467 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7px69"]
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.239203 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26874db3-d22a-4b27-8691-df2a106444f4-utilities\") pod \"community-operators-7px69\" (UID: \"26874db3-d22a-4b27-8691-df2a106444f4\") " pod="openshift-marketplace/community-operators-7px69"
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.239279 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8r7s\" (UniqueName: \"kubernetes.io/projected/26874db3-d22a-4b27-8691-df2a106444f4-kube-api-access-p8r7s\") pod \"community-operators-7px69\" (UID: \"26874db3-d22a-4b27-8691-df2a106444f4\") " pod="openshift-marketplace/community-operators-7px69"
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.239325 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26874db3-d22a-4b27-8691-df2a106444f4-catalog-content\") pod \"community-operators-7px69\" (UID: \"26874db3-d22a-4b27-8691-df2a106444f4\") " pod="openshift-marketplace/community-operators-7px69"
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.340601 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26874db3-d22a-4b27-8691-df2a106444f4-utilities\") pod \"community-operators-7px69\" (UID: \"26874db3-d22a-4b27-8691-df2a106444f4\") " pod="openshift-marketplace/community-operators-7px69"
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.340682 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8r7s\" (UniqueName: \"kubernetes.io/projected/26874db3-d22a-4b27-8691-df2a106444f4-kube-api-access-p8r7s\") pod \"community-operators-7px69\" (UID: \"26874db3-d22a-4b27-8691-df2a106444f4\") " pod="openshift-marketplace/community-operators-7px69"
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.340706 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26874db3-d22a-4b27-8691-df2a106444f4-catalog-content\") pod \"community-operators-7px69\" (UID: \"26874db3-d22a-4b27-8691-df2a106444f4\") " pod="openshift-marketplace/community-operators-7px69"
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.341255 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26874db3-d22a-4b27-8691-df2a106444f4-catalog-content\") pod \"community-operators-7px69\" (UID: \"26874db3-d22a-4b27-8691-df2a106444f4\") " pod="openshift-marketplace/community-operators-7px69"
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.341545 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26874db3-d22a-4b27-8691-df2a106444f4-utilities\") pod \"community-operators-7px69\" (UID: \"26874db3-d22a-4b27-8691-df2a106444f4\") " pod="openshift-marketplace/community-operators-7px69"
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.358465 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8r7s\" (UniqueName: \"kubernetes.io/projected/26874db3-d22a-4b27-8691-df2a106444f4-kube-api-access-p8r7s\") pod \"community-operators-7px69\" (UID: \"26874db3-d22a-4b27-8691-df2a106444f4\") " pod="openshift-marketplace/community-operators-7px69"
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.379690 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j6klp"]
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.380595 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j6klp"
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.382113 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.390250 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j6klp"]
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.441511 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22-catalog-content\") pod \"redhat-operators-j6klp\" (UID: \"83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22\") " pod="openshift-marketplace/redhat-operators-j6klp"
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.441560 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22-utilities\") pod \"redhat-operators-j6klp\" (UID: \"83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22\") " pod="openshift-marketplace/redhat-operators-j6klp"
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.441698 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr5q2\" (UniqueName: \"kubernetes.io/projected/83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22-kube-api-access-kr5q2\") pod \"redhat-operators-j6klp\" (UID: \"83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22\") " pod="openshift-marketplace/redhat-operators-j6klp"
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.498110 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7px69"
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.548581 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr5q2\" (UniqueName: \"kubernetes.io/projected/83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22-kube-api-access-kr5q2\") pod \"redhat-operators-j6klp\" (UID: \"83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22\") " pod="openshift-marketplace/redhat-operators-j6klp"
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.548763 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22-catalog-content\") pod \"redhat-operators-j6klp\" (UID: \"83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22\") " pod="openshift-marketplace/redhat-operators-j6klp"
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.548825 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22-utilities\") pod \"redhat-operators-j6klp\" (UID: \"83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22\") " pod="openshift-marketplace/redhat-operators-j6klp"
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.550224 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22-utilities\") pod \"redhat-operators-j6klp\" (UID: \"83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22\") " pod="openshift-marketplace/redhat-operators-j6klp"
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.552081 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22-catalog-content\") pod \"redhat-operators-j6klp\" (UID: \"83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22\") " pod="openshift-marketplace/redhat-operators-j6klp"
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.567032 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr5q2\" (UniqueName: \"kubernetes.io/projected/83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22-kube-api-access-kr5q2\") pod \"redhat-operators-j6klp\" (UID: \"83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22\") " pod="openshift-marketplace/redhat-operators-j6klp"
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.654396 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-jz4sv"
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.696254 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j6klp"
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.714891 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6rqpw"]
Feb 18 05:52:58 crc kubenswrapper[4869]: I0218 05:52:58.891679 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7px69"]
Feb 18 05:52:59 crc kubenswrapper[4869]: I0218 05:52:59.078268 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnzzv" event={"ID":"d26ef514-863b-48dd-8576-d68036b43bf6","Type":"ContainerStarted","Data":"d98319ad740a5b4d359ff41bc697b5eebde32d30c821a83999e77409ab35e8b7"}
Feb 18 05:52:59 crc kubenswrapper[4869]: I0218 05:52:59.079681 4869 generic.go:334] "Generic (PLEG): container finished" podID="26874db3-d22a-4b27-8691-df2a106444f4" containerID="015cd94f74f494b4a72736ed5e9a1cec0876345c5f984de38c243ea8e9808c3c" exitCode=0
Feb 18 05:52:59 crc kubenswrapper[4869]: I0218 05:52:59.079729 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7px69" event={"ID":"26874db3-d22a-4b27-8691-df2a106444f4","Type":"ContainerDied","Data":"015cd94f74f494b4a72736ed5e9a1cec0876345c5f984de38c243ea8e9808c3c"}
Feb 18 05:52:59 crc kubenswrapper[4869]: I0218 05:52:59.079761 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7px69" event={"ID":"26874db3-d22a-4b27-8691-df2a106444f4","Type":"ContainerStarted","Data":"b7249fd5d31da43f885ab11aa72a409b634ebff1fcadd7ab8f94660091ddd117"}
Feb 18 05:52:59 crc kubenswrapper[4869]: I0218 05:52:59.082019 4869 generic.go:334] "Generic (PLEG): container finished" podID="7d008101-ce8c-46ba-986d-a6c77a268c0b" containerID="d9953efc885962ef7acf43f255f16dcb2efdcf90c063c96e2c1b61a0cb9bc50d" exitCode=0
Feb 18 05:52:59 crc kubenswrapper[4869]: I0218 05:52:59.082075 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d6h99" event={"ID":"7d008101-ce8c-46ba-986d-a6c77a268c0b","Type":"ContainerDied","Data":"d9953efc885962ef7acf43f255f16dcb2efdcf90c063c96e2c1b61a0cb9bc50d"}
Feb 18 05:52:59 crc kubenswrapper[4869]: I0218 05:52:59.105239 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rnzzv" podStartSLOduration=2.485703781 podStartE2EDuration="4.105219916s" podCreationTimestamp="2026-02-18 05:52:55 +0000 UTC" firstStartedPulling="2026-02-18 05:52:57.063580252 +0000 UTC m=+274.232668484" lastFinishedPulling="2026-02-18 05:52:58.683096387 +0000 UTC m=+275.852184619" observedRunningTime="2026-02-18 05:52:59.099093463 +0000 UTC m=+276.268181705" watchObservedRunningTime="2026-02-18 05:52:59.105219916 +0000 UTC m=+276.274308148"
Feb 18 05:52:59 crc kubenswrapper[4869]: I0218 05:52:59.150367 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j6klp"]
Feb 18 05:53:00 crc kubenswrapper[4869]: I0218 05:53:00.096920 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7px69" event={"ID":"26874db3-d22a-4b27-8691-df2a106444f4","Type":"ContainerStarted","Data":"f24f06bc81f446a148df6ee0d89b5591245b32843300eafeed6d64608edb2b67"}
Feb 18 05:53:00 crc kubenswrapper[4869]: I0218 05:53:00.099569 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d6h99" event={"ID":"7d008101-ce8c-46ba-986d-a6c77a268c0b","Type":"ContainerStarted","Data":"bae850fc8d57123bfbb1f66aa2a1a0a74d01bba2c89fb6e8eacf66965c295bd8"}
Feb 18 05:53:00 crc kubenswrapper[4869]: I0218 05:53:00.101566 4869 generic.go:334] "Generic (PLEG): container finished" podID="83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22" containerID="85c97b54efe26f158e67a00332c8fe134a8cc72271aa27600033369ccc9ae0fc" exitCode=0
Feb 18 05:53:00 crc kubenswrapper[4869]: I0218 05:53:00.101617 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6klp" event={"ID":"83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22","Type":"ContainerDied","Data":"85c97b54efe26f158e67a00332c8fe134a8cc72271aa27600033369ccc9ae0fc"}
Feb 18 05:53:00 crc kubenswrapper[4869]: I0218 05:53:00.101641 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6klp" event={"ID":"83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22","Type":"ContainerStarted","Data":"59673e47dbaaebab65e324c164abfcde26e237d12f28b980f09962cedf918d28"}
Feb 18 05:53:01 crc kubenswrapper[4869]: I0218 05:53:01.107918 4869 generic.go:334] "Generic (PLEG): container finished" podID="26874db3-d22a-4b27-8691-df2a106444f4" containerID="f24f06bc81f446a148df6ee0d89b5591245b32843300eafeed6d64608edb2b67" exitCode=0
Feb 18 05:53:01 crc kubenswrapper[4869]: I0218 05:53:01.107989 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7px69" event={"ID":"26874db3-d22a-4b27-8691-df2a106444f4","Type":"ContainerDied","Data":"f24f06bc81f446a148df6ee0d89b5591245b32843300eafeed6d64608edb2b67"}
Feb 18 05:53:01 crc kubenswrapper[4869]: I0218 05:53:01.115405 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6klp" event={"ID":"83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22","Type":"ContainerStarted","Data":"86574317773ff0792acff0488d352500204f057e615bb7bce95228635e7476c5"}
Feb 18 05:53:01 crc kubenswrapper[4869]: I0218 05:53:01.130094 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d6h99" podStartSLOduration=3.705810494 podStartE2EDuration="6.13007577s" podCreationTimestamp="2026-02-18 05:52:55 +0000 UTC" firstStartedPulling="2026-02-18 05:52:57.060521796 +0000 UTC m=+274.229610028" lastFinishedPulling="2026-02-18 05:52:59.484787072 +0000 UTC m=+276.653875304" observedRunningTime="2026-02-18 05:53:00.150001466 +0000 UTC m=+277.319089698" watchObservedRunningTime="2026-02-18 05:53:01.13007577 +0000 UTC m=+278.299164002"
Feb 18 05:53:02 crc kubenswrapper[4869]: I0218 05:53:02.122174 4869 generic.go:334] "Generic (PLEG): container finished" podID="83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22" containerID="86574317773ff0792acff0488d352500204f057e615bb7bce95228635e7476c5" exitCode=0
Feb 18 05:53:02 crc kubenswrapper[4869]: I0218 05:53:02.122326 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6klp" event={"ID":"83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22","Type":"ContainerDied","Data":"86574317773ff0792acff0488d352500204f057e615bb7bce95228635e7476c5"}
Feb 18 05:53:02 crc kubenswrapper[4869]: I0218 05:53:02.124703 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7px69" event={"ID":"26874db3-d22a-4b27-8691-df2a106444f4","Type":"ContainerStarted","Data":"7ad80f31a81ac9227e2d0c708d33ba3fd705111ef2faa6aea6a9bebce48e9719"}
Feb 18 05:53:02 crc kubenswrapper[4869]: I0218 05:53:02.155716 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7px69" podStartSLOduration=1.721170131 podStartE2EDuration="4.155692763s" podCreationTimestamp="2026-02-18 05:52:58 +0000 UTC" firstStartedPulling="2026-02-18 05:52:59.080867008 +0000 UTC m=+276.249955240" lastFinishedPulling="2026-02-18 05:53:01.51538964 +0000 UTC m=+278.684477872" observedRunningTime="2026-02-18 05:53:02.152991595 +0000 UTC m=+279.322079827" watchObservedRunningTime="2026-02-18 05:53:02.155692763 +0000 UTC m=+279.324780995"
Feb 18 05:53:03 crc kubenswrapper[4869]: I0218 05:53:03.131811 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6klp" event={"ID":"83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22","Type":"ContainerStarted","Data":"7cbc09403fb4aa4e4178b63ac6ed310e76f3c5e18e2e62cce62ffd50f474f71d"}
Feb 18 05:53:03 crc kubenswrapper[4869]: I0218 05:53:03.150058 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j6klp" podStartSLOduration=2.780613227 podStartE2EDuration="5.150041763s" podCreationTimestamp="2026-02-18 05:52:58 +0000 UTC" firstStartedPulling="2026-02-18 05:53:00.10289421 +0000 UTC m=+277.271982442" lastFinishedPulling="2026-02-18 05:53:02.472322746 +0000 UTC m=+279.641410978" observedRunningTime="2026-02-18 05:53:03.148511265 +0000 UTC m=+280.317599497" watchObservedRunningTime="2026-02-18 05:53:03.150041763 +0000 UTC m=+280.319129995"
Feb 18 05:53:06 crc kubenswrapper[4869]: I0218 05:53:06.118927 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rnzzv"
Feb 18 05:53:06 crc kubenswrapper[4869]: I0218 05:53:06.119483 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rnzzv"
Feb 18 05:53:06 crc kubenswrapper[4869]: I0218 05:53:06.161489 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rnzzv"
Feb 18 05:53:06 crc kubenswrapper[4869]: I0218 05:53:06.200536 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rnzzv"
Feb 18 05:53:06 crc kubenswrapper[4869]: I0218 05:53:06.303027 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d6h99"
Feb 18 05:53:06 crc kubenswrapper[4869]: I0218 05:53:06.303886 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d6h99"
Feb 18 05:53:06 crc kubenswrapper[4869]: I0218 05:53:06.339386 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d6h99"
Feb 18 05:53:07 crc kubenswrapper[4869]: I0218 05:53:07.194210 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d6h99"
Feb 18 05:53:08 crc kubenswrapper[4869]: I0218 05:53:08.498418 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7px69"
Feb 18 05:53:08 crc kubenswrapper[4869]: I0218 05:53:08.499103 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7px69"
Feb 18 05:53:08 crc kubenswrapper[4869]: I0218 05:53:08.540421 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7px69"
Feb 18 05:53:08 crc kubenswrapper[4869]: I0218 05:53:08.697487 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j6klp"
Feb 18 05:53:08 crc kubenswrapper[4869]: I0218 05:53:08.697541 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j6klp"
Feb 18 05:53:08 crc kubenswrapper[4869]: I0218 05:53:08.735617 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j6klp"
Feb 18 05:53:09 crc kubenswrapper[4869]: I0218 05:53:09.202324 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7px69"
Feb 18 05:53:09 crc kubenswrapper[4869]: I0218 05:53:09.203061 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j6klp"
Feb 18 05:53:11 crc kubenswrapper[4869]: I0218 05:53:11.044995 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-59xq2" podUID="9742d031-8f05-438c-8028-700eb13042fe" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 05:53:23 crc kubenswrapper[4869]: I0218 05:53:23.262489 4869 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials
Feb 18 05:53:23 crc kubenswrapper[4869]: I0218 05:53:23.765006 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" podUID="7bd4427e-4327-477c-a527-de6c4bf89088" containerName="registry" containerID="cri-o://165e47f68f85e2a02c73a67eb7fbe81371e01a7fcb97182bab7d49f7622e6b40" gracePeriod=30
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.199021 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw"
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.215953 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7bd4427e-4327-477c-a527-de6c4bf89088-ca-trust-extracted\") pod \"7bd4427e-4327-477c-a527-de6c4bf89088\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") "
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.216161 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7bd4427e-4327-477c-a527-de6c4bf89088-installation-pull-secrets\") pod \"7bd4427e-4327-477c-a527-de6c4bf89088\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") "
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.216277 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7bd4427e-4327-477c-a527-de6c4bf89088-registry-certificates\") pod \"7bd4427e-4327-477c-a527-de6c4bf89088\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") "
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.219439 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bd4427e-4327-477c-a527-de6c4bf89088-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7bd4427e-4327-477c-a527-de6c4bf89088" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.224536 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bd4427e-4327-477c-a527-de6c4bf89088-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7bd4427e-4327-477c-a527-de6c4bf89088" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.243801 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bd4427e-4327-477c-a527-de6c4bf89088-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7bd4427e-4327-477c-a527-de6c4bf89088" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.317406 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"7bd4427e-4327-477c-a527-de6c4bf89088\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") "
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.317473 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7bd4427e-4327-477c-a527-de6c4bf89088-trusted-ca\") pod \"7bd4427e-4327-477c-a527-de6c4bf89088\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") "
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.317503 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7bd4427e-4327-477c-a527-de6c4bf89088-bound-sa-token\") pod \"7bd4427e-4327-477c-a527-de6c4bf89088\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") "
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.317571 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sncqj\" (UniqueName: \"kubernetes.io/projected/7bd4427e-4327-477c-a527-de6c4bf89088-kube-api-access-sncqj\") pod \"7bd4427e-4327-477c-a527-de6c4bf89088\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") "
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.317594 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7bd4427e-4327-477c-a527-de6c4bf89088-registry-tls\") pod \"7bd4427e-4327-477c-a527-de6c4bf89088\" (UID: \"7bd4427e-4327-477c-a527-de6c4bf89088\") "
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.317777 4869 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7bd4427e-4327-477c-a527-de6c4bf89088-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.317790 4869 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7bd4427e-4327-477c-a527-de6c4bf89088-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.317799 4869 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7bd4427e-4327-477c-a527-de6c4bf89088-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.317977 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bd4427e-4327-477c-a527-de6c4bf89088-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7bd4427e-4327-477c-a527-de6c4bf89088" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.320422 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bd4427e-4327-477c-a527-de6c4bf89088-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7bd4427e-4327-477c-a527-de6c4bf89088" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.320573 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bd4427e-4327-477c-a527-de6c4bf89088-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7bd4427e-4327-477c-a527-de6c4bf89088" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.322707 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bd4427e-4327-477c-a527-de6c4bf89088-kube-api-access-sncqj" (OuterVolumeSpecName: "kube-api-access-sncqj") pod "7bd4427e-4327-477c-a527-de6c4bf89088" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088"). InnerVolumeSpecName "kube-api-access-sncqj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.325998 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "7bd4427e-4327-477c-a527-de6c4bf89088" (UID: "7bd4427e-4327-477c-a527-de6c4bf89088"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.418670 4869 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7bd4427e-4327-477c-a527-de6c4bf89088-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.418706 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sncqj\" (UniqueName: \"kubernetes.io/projected/7bd4427e-4327-477c-a527-de6c4bf89088-kube-api-access-sncqj\") on node \"crc\" DevicePath \"\""
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.418718 4869 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7bd4427e-4327-477c-a527-de6c4bf89088-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.418727 4869 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7bd4427e-4327-477c-a527-de6c4bf89088-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.512879 4869 generic.go:334] "Generic (PLEG): container finished" podID="7bd4427e-4327-477c-a527-de6c4bf89088" containerID="165e47f68f85e2a02c73a67eb7fbe81371e01a7fcb97182bab7d49f7622e6b40" exitCode=0
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.512924 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" event={"ID":"7bd4427e-4327-477c-a527-de6c4bf89088","Type":"ContainerDied","Data":"165e47f68f85e2a02c73a67eb7fbe81371e01a7fcb97182bab7d49f7622e6b40"}
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.512966 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" event={"ID":"7bd4427e-4327-477c-a527-de6c4bf89088","Type":"ContainerDied","Data":"01b66c0fb6f56292defad182a097338940db5c6684a023ba34afac5db0df3b05"}
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.512991 4869 scope.go:117] "RemoveContainer" containerID="165e47f68f85e2a02c73a67eb7fbe81371e01a7fcb97182bab7d49f7622e6b40"
Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.513161 4869 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6rqpw" Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.531392 4869 scope.go:117] "RemoveContainer" containerID="165e47f68f85e2a02c73a67eb7fbe81371e01a7fcb97182bab7d49f7622e6b40" Feb 18 05:53:24 crc kubenswrapper[4869]: E0218 05:53:24.531876 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"165e47f68f85e2a02c73a67eb7fbe81371e01a7fcb97182bab7d49f7622e6b40\": container with ID starting with 165e47f68f85e2a02c73a67eb7fbe81371e01a7fcb97182bab7d49f7622e6b40 not found: ID does not exist" containerID="165e47f68f85e2a02c73a67eb7fbe81371e01a7fcb97182bab7d49f7622e6b40" Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.531909 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"165e47f68f85e2a02c73a67eb7fbe81371e01a7fcb97182bab7d49f7622e6b40"} err="failed to get container status \"165e47f68f85e2a02c73a67eb7fbe81371e01a7fcb97182bab7d49f7622e6b40\": rpc error: code = NotFound desc = could not find container \"165e47f68f85e2a02c73a67eb7fbe81371e01a7fcb97182bab7d49f7622e6b40\": container with ID starting with 165e47f68f85e2a02c73a67eb7fbe81371e01a7fcb97182bab7d49f7622e6b40 not found: ID does not exist" Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.540129 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6rqpw"] Feb 18 05:53:24 crc kubenswrapper[4869]: I0218 05:53:24.544589 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6rqpw"] Feb 18 05:53:25 crc kubenswrapper[4869]: I0218 05:53:25.481152 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bd4427e-4327-477c-a527-de6c4bf89088" path="/var/lib/kubelet/pods/7bd4427e-4327-477c-a527-de6c4bf89088/volumes" Feb 18 05:54:10 crc kubenswrapper[4869]: I0218 
05:54:10.132816 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 05:54:10 crc kubenswrapper[4869]: I0218 05:54:10.133426 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 05:54:40 crc kubenswrapper[4869]: I0218 05:54:40.133318 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 05:54:40 crc kubenswrapper[4869]: I0218 05:54:40.133849 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 05:55:10 crc kubenswrapper[4869]: I0218 05:55:10.133054 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 05:55:10 crc kubenswrapper[4869]: I0218 05:55:10.133603 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" 
podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 05:55:10 crc kubenswrapper[4869]: I0218 05:55:10.133648 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" Feb 18 05:55:10 crc kubenswrapper[4869]: I0218 05:55:10.134285 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e6cda088debb8f246e50d84872e8120984c327e81d95605578875258e09eeddf"} pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 05:55:10 crc kubenswrapper[4869]: I0218 05:55:10.134347 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" containerID="cri-o://e6cda088debb8f246e50d84872e8120984c327e81d95605578875258e09eeddf" gracePeriod=600 Feb 18 05:55:11 crc kubenswrapper[4869]: I0218 05:55:11.057516 4869 generic.go:334] "Generic (PLEG): container finished" podID="781aec66-5fc7-4161-a704-cc78830d525d" containerID="e6cda088debb8f246e50d84872e8120984c327e81d95605578875258e09eeddf" exitCode=0 Feb 18 05:55:11 crc kubenswrapper[4869]: I0218 05:55:11.057890 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerDied","Data":"e6cda088debb8f246e50d84872e8120984c327e81d95605578875258e09eeddf"} Feb 18 05:55:11 crc kubenswrapper[4869]: I0218 05:55:11.058067 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerStarted","Data":"9af2aed1a265c6c4127223a14b7d2dfbeb17faca5aaf7f8066c5e58e1ab7d105"} Feb 18 05:55:11 crc kubenswrapper[4869]: I0218 05:55:11.058098 4869 scope.go:117] "RemoveContainer" containerID="97067aaa66b615246c12637e475a0c048474fb0516f40d0cbe72ff5c54a9bc80" Feb 18 05:57:10 crc kubenswrapper[4869]: I0218 05:57:10.133309 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 05:57:10 crc kubenswrapper[4869]: I0218 05:57:10.133879 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 05:57:31 crc kubenswrapper[4869]: I0218 05:57:31.648780 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-v52pk"] Feb 18 05:57:31 crc kubenswrapper[4869]: E0218 05:57:31.650941 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bd4427e-4327-477c-a527-de6c4bf89088" containerName="registry" Feb 18 05:57:31 crc kubenswrapper[4869]: I0218 05:57:31.650973 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bd4427e-4327-477c-a527-de6c4bf89088" containerName="registry" Feb 18 05:57:31 crc kubenswrapper[4869]: I0218 05:57:31.651160 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bd4427e-4327-477c-a527-de6c4bf89088" containerName="registry" Feb 18 05:57:31 crc kubenswrapper[4869]: I0218 05:57:31.651561 4869 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-v52pk" Feb 18 05:57:31 crc kubenswrapper[4869]: I0218 05:57:31.654024 4869 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-sgwg2" Feb 18 05:57:31 crc kubenswrapper[4869]: I0218 05:57:31.655204 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 18 05:57:31 crc kubenswrapper[4869]: I0218 05:57:31.655698 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 18 05:57:31 crc kubenswrapper[4869]: I0218 05:57:31.661908 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-v52pk"] Feb 18 05:57:31 crc kubenswrapper[4869]: I0218 05:57:31.676634 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-xc26m"] Feb 18 05:57:31 crc kubenswrapper[4869]: I0218 05:57:31.677546 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-xc26m" Feb 18 05:57:31 crc kubenswrapper[4869]: I0218 05:57:31.679500 4869 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-sfsk6" Feb 18 05:57:31 crc kubenswrapper[4869]: I0218 05:57:31.681485 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-xc26m"] Feb 18 05:57:31 crc kubenswrapper[4869]: I0218 05:57:31.700182 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-zvwk2"] Feb 18 05:57:31 crc kubenswrapper[4869]: I0218 05:57:31.700965 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-zvwk2" Feb 18 05:57:31 crc kubenswrapper[4869]: I0218 05:57:31.702544 4869 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-h9td9" Feb 18 05:57:31 crc kubenswrapper[4869]: I0218 05:57:31.705319 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-zvwk2"] Feb 18 05:57:31 crc kubenswrapper[4869]: I0218 05:57:31.731829 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4hxs\" (UniqueName: \"kubernetes.io/projected/6b66e86e-81f9-44f2-b711-ad17fc1504a6-kube-api-access-c4hxs\") pod \"cert-manager-cainjector-cf98fcc89-v52pk\" (UID: \"6b66e86e-81f9-44f2-b711-ad17fc1504a6\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-v52pk" Feb 18 05:57:31 crc kubenswrapper[4869]: I0218 05:57:31.731890 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhlwf\" (UniqueName: \"kubernetes.io/projected/6e6c32f7-24fc-4637-9607-bef3c0d85bb7-kube-api-access-bhlwf\") pod \"cert-manager-webhook-687f57d79b-zvwk2\" (UID: \"6e6c32f7-24fc-4637-9607-bef3c0d85bb7\") " pod="cert-manager/cert-manager-webhook-687f57d79b-zvwk2" Feb 18 05:57:31 crc kubenswrapper[4869]: I0218 05:57:31.731917 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxfhk\" (UniqueName: \"kubernetes.io/projected/2742a7dc-644d-4ede-be60-c014ffd5ad38-kube-api-access-cxfhk\") pod \"cert-manager-858654f9db-xc26m\" (UID: \"2742a7dc-644d-4ede-be60-c014ffd5ad38\") " pod="cert-manager/cert-manager-858654f9db-xc26m" Feb 18 05:57:31 crc kubenswrapper[4869]: I0218 05:57:31.833585 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhlwf\" (UniqueName: 
\"kubernetes.io/projected/6e6c32f7-24fc-4637-9607-bef3c0d85bb7-kube-api-access-bhlwf\") pod \"cert-manager-webhook-687f57d79b-zvwk2\" (UID: \"6e6c32f7-24fc-4637-9607-bef3c0d85bb7\") " pod="cert-manager/cert-manager-webhook-687f57d79b-zvwk2" Feb 18 05:57:31 crc kubenswrapper[4869]: I0218 05:57:31.833638 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxfhk\" (UniqueName: \"kubernetes.io/projected/2742a7dc-644d-4ede-be60-c014ffd5ad38-kube-api-access-cxfhk\") pod \"cert-manager-858654f9db-xc26m\" (UID: \"2742a7dc-644d-4ede-be60-c014ffd5ad38\") " pod="cert-manager/cert-manager-858654f9db-xc26m" Feb 18 05:57:31 crc kubenswrapper[4869]: I0218 05:57:31.833700 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4hxs\" (UniqueName: \"kubernetes.io/projected/6b66e86e-81f9-44f2-b711-ad17fc1504a6-kube-api-access-c4hxs\") pod \"cert-manager-cainjector-cf98fcc89-v52pk\" (UID: \"6b66e86e-81f9-44f2-b711-ad17fc1504a6\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-v52pk" Feb 18 05:57:31 crc kubenswrapper[4869]: I0218 05:57:31.852539 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4hxs\" (UniqueName: \"kubernetes.io/projected/6b66e86e-81f9-44f2-b711-ad17fc1504a6-kube-api-access-c4hxs\") pod \"cert-manager-cainjector-cf98fcc89-v52pk\" (UID: \"6b66e86e-81f9-44f2-b711-ad17fc1504a6\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-v52pk" Feb 18 05:57:31 crc kubenswrapper[4869]: I0218 05:57:31.854887 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhlwf\" (UniqueName: \"kubernetes.io/projected/6e6c32f7-24fc-4637-9607-bef3c0d85bb7-kube-api-access-bhlwf\") pod \"cert-manager-webhook-687f57d79b-zvwk2\" (UID: \"6e6c32f7-24fc-4637-9607-bef3c0d85bb7\") " pod="cert-manager/cert-manager-webhook-687f57d79b-zvwk2" Feb 18 05:57:31 crc kubenswrapper[4869]: I0218 05:57:31.855051 4869 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxfhk\" (UniqueName: \"kubernetes.io/projected/2742a7dc-644d-4ede-be60-c014ffd5ad38-kube-api-access-cxfhk\") pod \"cert-manager-858654f9db-xc26m\" (UID: \"2742a7dc-644d-4ede-be60-c014ffd5ad38\") " pod="cert-manager/cert-manager-858654f9db-xc26m" Feb 18 05:57:32 crc kubenswrapper[4869]: I0218 05:57:32.022910 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-v52pk" Feb 18 05:57:32 crc kubenswrapper[4869]: I0218 05:57:32.036081 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-xc26m" Feb 18 05:57:32 crc kubenswrapper[4869]: I0218 05:57:32.047139 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-zvwk2" Feb 18 05:57:32 crc kubenswrapper[4869]: I0218 05:57:32.250615 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-v52pk"] Feb 18 05:57:32 crc kubenswrapper[4869]: I0218 05:57:32.261664 4869 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 05:57:32 crc kubenswrapper[4869]: I0218 05:57:32.501782 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-xc26m"] Feb 18 05:57:32 crc kubenswrapper[4869]: W0218 05:57:32.507327 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2742a7dc_644d_4ede_be60_c014ffd5ad38.slice/crio-0c823c6f9b7b8b17eb7870b4f2acd718790417d46466e521a71588b666b182ad WatchSource:0}: Error finding container 0c823c6f9b7b8b17eb7870b4f2acd718790417d46466e521a71588b666b182ad: Status 404 returned error can't find the container with id 0c823c6f9b7b8b17eb7870b4f2acd718790417d46466e521a71588b666b182ad Feb 18 05:57:32 crc 
kubenswrapper[4869]: I0218 05:57:32.508850 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-zvwk2"] Feb 18 05:57:32 crc kubenswrapper[4869]: I0218 05:57:32.916635 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-xc26m" event={"ID":"2742a7dc-644d-4ede-be60-c014ffd5ad38","Type":"ContainerStarted","Data":"0c823c6f9b7b8b17eb7870b4f2acd718790417d46466e521a71588b666b182ad"} Feb 18 05:57:32 crc kubenswrapper[4869]: I0218 05:57:32.917737 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-v52pk" event={"ID":"6b66e86e-81f9-44f2-b711-ad17fc1504a6","Type":"ContainerStarted","Data":"b85ec28a378dc534cf9f5751dfbc95c029a85533301c336cf1b188e2f9bf985c"} Feb 18 05:57:32 crc kubenswrapper[4869]: I0218 05:57:32.918719 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-zvwk2" event={"ID":"6e6c32f7-24fc-4637-9607-bef3c0d85bb7","Type":"ContainerStarted","Data":"2afc47e00217d50883a0ccd1cce9b941b14b99084472f3d68e6a33122e8bb1b4"} Feb 18 05:57:34 crc kubenswrapper[4869]: I0218 05:57:34.932188 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-v52pk" event={"ID":"6b66e86e-81f9-44f2-b711-ad17fc1504a6","Type":"ContainerStarted","Data":"0fe47e71a15fd64422b8ef61fda890a9100f6341148294200bdda46a3bad0413"} Feb 18 05:57:34 crc kubenswrapper[4869]: I0218 05:57:34.947679 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-v52pk" podStartSLOduration=1.932100454 podStartE2EDuration="3.947660555s" podCreationTimestamp="2026-02-18 05:57:31 +0000 UTC" firstStartedPulling="2026-02-18 05:57:32.261410696 +0000 UTC m=+549.430498928" lastFinishedPulling="2026-02-18 05:57:34.276970797 +0000 UTC m=+551.446059029" observedRunningTime="2026-02-18 05:57:34.944691303 +0000 UTC 
m=+552.113779535" watchObservedRunningTime="2026-02-18 05:57:34.947660555 +0000 UTC m=+552.116748787" Feb 18 05:57:35 crc kubenswrapper[4869]: I0218 05:57:35.938215 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-zvwk2" event={"ID":"6e6c32f7-24fc-4637-9607-bef3c0d85bb7","Type":"ContainerStarted","Data":"f025ca7ac12e607196fc9697d28498c325e1aadf3941691e3603701fd5e9268a"} Feb 18 05:57:35 crc kubenswrapper[4869]: I0218 05:57:35.938312 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-zvwk2" Feb 18 05:57:35 crc kubenswrapper[4869]: I0218 05:57:35.940982 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-xc26m" event={"ID":"2742a7dc-644d-4ede-be60-c014ffd5ad38","Type":"ContainerStarted","Data":"fd74537820d5db0da5c295097f10ce8d4caef9cf875773513af6ab80c50bf557"} Feb 18 05:57:35 crc kubenswrapper[4869]: I0218 05:57:35.957977 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-zvwk2" podStartSLOduration=1.752787943 podStartE2EDuration="4.957955286s" podCreationTimestamp="2026-02-18 05:57:31 +0000 UTC" firstStartedPulling="2026-02-18 05:57:32.508828277 +0000 UTC m=+549.677916509" lastFinishedPulling="2026-02-18 05:57:35.71399562 +0000 UTC m=+552.883083852" observedRunningTime="2026-02-18 05:57:35.954939323 +0000 UTC m=+553.124027565" watchObservedRunningTime="2026-02-18 05:57:35.957955286 +0000 UTC m=+553.127043518" Feb 18 05:57:35 crc kubenswrapper[4869]: I0218 05:57:35.970908 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-xc26m" podStartSLOduration=1.7100857999999999 podStartE2EDuration="4.970890909s" podCreationTimestamp="2026-02-18 05:57:31 +0000 UTC" firstStartedPulling="2026-02-18 05:57:32.511466871 +0000 UTC m=+549.680555093" lastFinishedPulling="2026-02-18 
05:57:35.77227196 +0000 UTC m=+552.941360202" observedRunningTime="2026-02-18 05:57:35.969838424 +0000 UTC m=+553.138926666" watchObservedRunningTime="2026-02-18 05:57:35.970890909 +0000 UTC m=+553.139979131" Feb 18 05:57:40 crc kubenswrapper[4869]: I0218 05:57:40.133520 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 05:57:40 crc kubenswrapper[4869]: I0218 05:57:40.134622 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 05:57:41 crc kubenswrapper[4869]: I0218 05:57:41.802501 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mvs9q"] Feb 18 05:57:41 crc kubenswrapper[4869]: I0218 05:57:41.802870 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="ovn-controller" containerID="cri-o://e7307f4b57ff0bdeb6c357526c1b4f36e8187d31776212a82f1cfa23b0967ad2" gracePeriod=30 Feb 18 05:57:41 crc kubenswrapper[4869]: I0218 05:57:41.803280 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="sbdb" containerID="cri-o://0837ffbf5f6a9c3305554c6fb64fe0352d25511750c6d4e61082f168d3b9a7e6" gracePeriod=30 Feb 18 05:57:41 crc kubenswrapper[4869]: I0218 05:57:41.803324 4869 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="nbdb" containerID="cri-o://70862fff347446129454a43683a347883e36d81a5d909013b5314500b33312cd" gracePeriod=30 Feb 18 05:57:41 crc kubenswrapper[4869]: I0218 05:57:41.803355 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="northd" containerID="cri-o://41418d0c8201ad9058504d96c160609a322ed6366875006af5d63e23b22d5a2f" gracePeriod=30 Feb 18 05:57:41 crc kubenswrapper[4869]: I0218 05:57:41.803386 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://4ee8aadf6ad995457023ee0e612a1c97316952c5e23c95031d54c41f8eb46f92" gracePeriod=30 Feb 18 05:57:41 crc kubenswrapper[4869]: I0218 05:57:41.803412 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="kube-rbac-proxy-node" containerID="cri-o://4000b9032973107bf63573949eba7869ac1e56195a634cbe30d60dea8cdc92c9" gracePeriod=30 Feb 18 05:57:41 crc kubenswrapper[4869]: I0218 05:57:41.803443 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="ovn-acl-logging" containerID="cri-o://b109af6f911eae9240a588e5fc6d09fac51c9d272033edc23d5dec25cfad746a" gracePeriod=30 Feb 18 05:57:41 crc kubenswrapper[4869]: I0218 05:57:41.831884 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="ovnkube-controller" 
containerID="cri-o://fdf57f8214c9edcc60b7f0fe48fd3e36de28209082e5713a5f43dc740db378ee" gracePeriod=30 Feb 18 05:57:41 crc kubenswrapper[4869]: I0218 05:57:41.980793 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mvs9q_a30082d3-c125-4e76-8ead-3633b967d974/ovn-acl-logging/0.log" Feb 18 05:57:41 crc kubenswrapper[4869]: I0218 05:57:41.981644 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mvs9q_a30082d3-c125-4e76-8ead-3633b967d974/ovn-controller/0.log" Feb 18 05:57:41 crc kubenswrapper[4869]: I0218 05:57:41.982290 4869 generic.go:334] "Generic (PLEG): container finished" podID="a30082d3-c125-4e76-8ead-3633b967d974" containerID="fdf57f8214c9edcc60b7f0fe48fd3e36de28209082e5713a5f43dc740db378ee" exitCode=0 Feb 18 05:57:41 crc kubenswrapper[4869]: I0218 05:57:41.982373 4869 generic.go:334] "Generic (PLEG): container finished" podID="a30082d3-c125-4e76-8ead-3633b967d974" containerID="41418d0c8201ad9058504d96c160609a322ed6366875006af5d63e23b22d5a2f" exitCode=0 Feb 18 05:57:41 crc kubenswrapper[4869]: I0218 05:57:41.982386 4869 generic.go:334] "Generic (PLEG): container finished" podID="a30082d3-c125-4e76-8ead-3633b967d974" containerID="4ee8aadf6ad995457023ee0e612a1c97316952c5e23c95031d54c41f8eb46f92" exitCode=0 Feb 18 05:57:41 crc kubenswrapper[4869]: I0218 05:57:41.982395 4869 generic.go:334] "Generic (PLEG): container finished" podID="a30082d3-c125-4e76-8ead-3633b967d974" containerID="4000b9032973107bf63573949eba7869ac1e56195a634cbe30d60dea8cdc92c9" exitCode=0 Feb 18 05:57:41 crc kubenswrapper[4869]: I0218 05:57:41.982433 4869 generic.go:334] "Generic (PLEG): container finished" podID="a30082d3-c125-4e76-8ead-3633b967d974" containerID="b109af6f911eae9240a588e5fc6d09fac51c9d272033edc23d5dec25cfad746a" exitCode=143 Feb 18 05:57:41 crc kubenswrapper[4869]: I0218 05:57:41.982444 4869 generic.go:334] "Generic (PLEG): container finished" 
podID="a30082d3-c125-4e76-8ead-3633b967d974" containerID="e7307f4b57ff0bdeb6c357526c1b4f36e8187d31776212a82f1cfa23b0967ad2" exitCode=143 Feb 18 05:57:41 crc kubenswrapper[4869]: I0218 05:57:41.982721 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" event={"ID":"a30082d3-c125-4e76-8ead-3633b967d974","Type":"ContainerDied","Data":"fdf57f8214c9edcc60b7f0fe48fd3e36de28209082e5713a5f43dc740db378ee"} Feb 18 05:57:41 crc kubenswrapper[4869]: I0218 05:57:41.982781 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" event={"ID":"a30082d3-c125-4e76-8ead-3633b967d974","Type":"ContainerDied","Data":"41418d0c8201ad9058504d96c160609a322ed6366875006af5d63e23b22d5a2f"} Feb 18 05:57:41 crc kubenswrapper[4869]: I0218 05:57:41.982796 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" event={"ID":"a30082d3-c125-4e76-8ead-3633b967d974","Type":"ContainerDied","Data":"4ee8aadf6ad995457023ee0e612a1c97316952c5e23c95031d54c41f8eb46f92"} Feb 18 05:57:41 crc kubenswrapper[4869]: I0218 05:57:41.982815 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" event={"ID":"a30082d3-c125-4e76-8ead-3633b967d974","Type":"ContainerDied","Data":"4000b9032973107bf63573949eba7869ac1e56195a634cbe30d60dea8cdc92c9"} Feb 18 05:57:41 crc kubenswrapper[4869]: I0218 05:57:41.982827 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" event={"ID":"a30082d3-c125-4e76-8ead-3633b967d974","Type":"ContainerDied","Data":"b109af6f911eae9240a588e5fc6d09fac51c9d272033edc23d5dec25cfad746a"} Feb 18 05:57:41 crc kubenswrapper[4869]: I0218 05:57:41.982838 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" 
event={"ID":"a30082d3-c125-4e76-8ead-3633b967d974","Type":"ContainerDied","Data":"e7307f4b57ff0bdeb6c357526c1b4f36e8187d31776212a82f1cfa23b0967ad2"} Feb 18 05:57:41 crc kubenswrapper[4869]: I0218 05:57:41.985152 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2gzwj_87706ebb-d517-4b38-a542-d0afd6c8c9c2/kube-multus/0.log" Feb 18 05:57:41 crc kubenswrapper[4869]: I0218 05:57:41.985205 4869 generic.go:334] "Generic (PLEG): container finished" podID="87706ebb-d517-4b38-a542-d0afd6c8c9c2" containerID="223dc0969547c24dfedc3291abfc2d364a072bd0ed0624e883a151464a9e90dc" exitCode=2 Feb 18 05:57:41 crc kubenswrapper[4869]: I0218 05:57:41.985245 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2gzwj" event={"ID":"87706ebb-d517-4b38-a542-d0afd6c8c9c2","Type":"ContainerDied","Data":"223dc0969547c24dfedc3291abfc2d364a072bd0ed0624e883a151464a9e90dc"} Feb 18 05:57:41 crc kubenswrapper[4869]: I0218 05:57:41.992135 4869 scope.go:117] "RemoveContainer" containerID="223dc0969547c24dfedc3291abfc2d364a072bd0ed0624e883a151464a9e90dc" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.050229 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-zvwk2" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.113523 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mvs9q_a30082d3-c125-4e76-8ead-3633b967d974/ovn-acl-logging/0.log" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.114145 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mvs9q_a30082d3-c125-4e76-8ead-3633b967d974/ovn-controller/0.log" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.114764 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.178484 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-slash\") pod \"a30082d3-c125-4e76-8ead-3633b967d974\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.178557 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-var-lib-cni-networks-ovn-kubernetes\") pod \"a30082d3-c125-4e76-8ead-3633b967d974\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.178597 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-log-socket\") pod \"a30082d3-c125-4e76-8ead-3633b967d974\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.178668 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-slash" (OuterVolumeSpecName: "host-slash") pod "a30082d3-c125-4e76-8ead-3633b967d974" (UID: "a30082d3-c125-4e76-8ead-3633b967d974"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.178669 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-var-lib-openvswitch\") pod \"a30082d3-c125-4e76-8ead-3633b967d974\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.178715 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "a30082d3-c125-4e76-8ead-3633b967d974" (UID: "a30082d3-c125-4e76-8ead-3633b967d974"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.178776 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a30082d3-c125-4e76-8ead-3633b967d974-ovnkube-config\") pod \"a30082d3-c125-4e76-8ead-3633b967d974\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.178847 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-kubelet\") pod \"a30082d3-c125-4e76-8ead-3633b967d974\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.178883 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-run-systemd\") pod \"a30082d3-c125-4e76-8ead-3633b967d974\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.178928 4869 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-run-ovn\") pod \"a30082d3-c125-4e76-8ead-3633b967d974\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.178974 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-systemd-units\") pod \"a30082d3-c125-4e76-8ead-3633b967d974\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.179009 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a30082d3-c125-4e76-8ead-3633b967d974-ovnkube-script-lib\") pod \"a30082d3-c125-4e76-8ead-3633b967d974\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.179093 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-run-netns\") pod \"a30082d3-c125-4e76-8ead-3633b967d974\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.179174 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-cni-bin\") pod \"a30082d3-c125-4e76-8ead-3633b967d974\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.179209 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-node-log\") pod \"a30082d3-c125-4e76-8ead-3633b967d974\" (UID: 
\"a30082d3-c125-4e76-8ead-3633b967d974\") " Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.179246 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-cni-netd\") pod \"a30082d3-c125-4e76-8ead-3633b967d974\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.179286 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-etc-openvswitch\") pod \"a30082d3-c125-4e76-8ead-3633b967d974\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.179342 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-run-openvswitch\") pod \"a30082d3-c125-4e76-8ead-3633b967d974\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.179394 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxc85\" (UniqueName: \"kubernetes.io/projected/a30082d3-c125-4e76-8ead-3633b967d974-kube-api-access-xxc85\") pod \"a30082d3-c125-4e76-8ead-3633b967d974\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.179437 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a30082d3-c125-4e76-8ead-3633b967d974-ovn-node-metrics-cert\") pod \"a30082d3-c125-4e76-8ead-3633b967d974\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.179471 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-run-ovn-kubernetes\") pod \"a30082d3-c125-4e76-8ead-3633b967d974\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.179513 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a30082d3-c125-4e76-8ead-3633b967d974-env-overrides\") pod \"a30082d3-c125-4e76-8ead-3633b967d974\" (UID: \"a30082d3-c125-4e76-8ead-3633b967d974\") " Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.178786 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "a30082d3-c125-4e76-8ead-3633b967d974" (UID: "a30082d3-c125-4e76-8ead-3633b967d974"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.178814 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-log-socket" (OuterVolumeSpecName: "log-socket") pod "a30082d3-c125-4e76-8ead-3633b967d974" (UID: "a30082d3-c125-4e76-8ead-3633b967d974"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.179544 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a30082d3-c125-4e76-8ead-3633b967d974-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "a30082d3-c125-4e76-8ead-3633b967d974" (UID: "a30082d3-c125-4e76-8ead-3633b967d974"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.179586 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "a30082d3-c125-4e76-8ead-3633b967d974" (UID: "a30082d3-c125-4e76-8ead-3633b967d974"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.179651 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "a30082d3-c125-4e76-8ead-3633b967d974" (UID: "a30082d3-c125-4e76-8ead-3633b967d974"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.179918 4869 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.179953 4869 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-slash\") on node \"crc\" DevicePath \"\"" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.179974 4869 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-log-socket\") on node \"crc\" DevicePath \"\"" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.179993 4869 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.180012 4869 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a30082d3-c125-4e76-8ead-3633b967d974-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.180031 4869 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.180047 4869 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.180321 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "a30082d3-c125-4e76-8ead-3633b967d974" (UID: "a30082d3-c125-4e76-8ead-3633b967d974"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.180400 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "a30082d3-c125-4e76-8ead-3633b967d974" (UID: "a30082d3-c125-4e76-8ead-3633b967d974"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.180388 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "a30082d3-c125-4e76-8ead-3633b967d974" (UID: "a30082d3-c125-4e76-8ead-3633b967d974"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.180475 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-node-log" (OuterVolumeSpecName: "node-log") pod "a30082d3-c125-4e76-8ead-3633b967d974" (UID: "a30082d3-c125-4e76-8ead-3633b967d974"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.180493 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "a30082d3-c125-4e76-8ead-3633b967d974" (UID: "a30082d3-c125-4e76-8ead-3633b967d974"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.180531 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wmp96"] Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.180521 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "a30082d3-c125-4e76-8ead-3633b967d974" (UID: "a30082d3-c125-4e76-8ead-3633b967d974"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.180552 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "a30082d3-c125-4e76-8ead-3633b967d974" (UID: "a30082d3-c125-4e76-8ead-3633b967d974"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:57:42 crc kubenswrapper[4869]: E0218 05:57:42.180894 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="nbdb" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.180919 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="nbdb" Feb 18 05:57:42 crc kubenswrapper[4869]: E0218 05:57:42.180936 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="ovn-controller" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.180947 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="ovn-controller" Feb 18 05:57:42 crc kubenswrapper[4869]: E0218 05:57:42.180962 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="kube-rbac-proxy-ovn-metrics" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.180973 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="kube-rbac-proxy-ovn-metrics" Feb 18 05:57:42 crc kubenswrapper[4869]: E0218 05:57:42.180984 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="kube-rbac-proxy-node" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.180993 4869 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="kube-rbac-proxy-node" Feb 18 05:57:42 crc kubenswrapper[4869]: E0218 05:57:42.181006 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="sbdb" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.181350 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a30082d3-c125-4e76-8ead-3633b967d974-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "a30082d3-c125-4e76-8ead-3633b967d974" (UID: "a30082d3-c125-4e76-8ead-3633b967d974"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.181474 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "a30082d3-c125-4e76-8ead-3633b967d974" (UID: "a30082d3-c125-4e76-8ead-3633b967d974"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.181677 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a30082d3-c125-4e76-8ead-3633b967d974-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "a30082d3-c125-4e76-8ead-3633b967d974" (UID: "a30082d3-c125-4e76-8ead-3633b967d974"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.181014 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="sbdb" Feb 18 05:57:42 crc kubenswrapper[4869]: E0218 05:57:42.182294 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="ovn-acl-logging" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.182322 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="ovn-acl-logging" Feb 18 05:57:42 crc kubenswrapper[4869]: E0218 05:57:42.182342 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="northd" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.182352 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="northd" Feb 18 05:57:42 crc kubenswrapper[4869]: E0218 05:57:42.182367 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="kubecfg-setup" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.182377 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="kubecfg-setup" Feb 18 05:57:42 crc kubenswrapper[4869]: E0218 05:57:42.182388 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="ovnkube-controller" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.182399 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="ovnkube-controller" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.182569 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="a30082d3-c125-4e76-8ead-3633b967d974" 
containerName="ovn-controller" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.182593 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="nbdb" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.182608 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="sbdb" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.182627 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="ovnkube-controller" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.182641 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="kube-rbac-proxy-ovn-metrics" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.182654 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="ovn-acl-logging" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.182666 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="northd" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.182682 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="a30082d3-c125-4e76-8ead-3633b967d974" containerName="kube-rbac-proxy-node" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.185987 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.187866 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a30082d3-c125-4e76-8ead-3633b967d974-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "a30082d3-c125-4e76-8ead-3633b967d974" (UID: "a30082d3-c125-4e76-8ead-3633b967d974"). 
InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.187919 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a30082d3-c125-4e76-8ead-3633b967d974-kube-api-access-xxc85" (OuterVolumeSpecName: "kube-api-access-xxc85") pod "a30082d3-c125-4e76-8ead-3633b967d974" (UID: "a30082d3-c125-4e76-8ead-3633b967d974"). InnerVolumeSpecName "kube-api-access-xxc85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.194464 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "a30082d3-c125-4e76-8ead-3633b967d974" (UID: "a30082d3-c125-4e76-8ead-3633b967d974"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281232 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-host-kubelet\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281283 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-etc-openvswitch\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281316 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-run-openvswitch\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281375 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-host-slash\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281395 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-log-socket\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281439 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7b841aa-01cd-48f0-b079-ce75b438144c-ovnkube-config\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281458 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7b841aa-01cd-48f0-b079-ce75b438144c-ovn-node-metrics-cert\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281479 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281499 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-var-lib-openvswitch\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281519 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-host-run-ovn-kubernetes\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281538 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-systemd-units\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281551 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7b841aa-01cd-48f0-b079-ce75b438144c-env-overrides\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281564 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-node-log\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281581 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-run-systemd\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281598 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t9hf\" (UniqueName: \"kubernetes.io/projected/e7b841aa-01cd-48f0-b079-ce75b438144c-kube-api-access-6t9hf\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281614 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7b841aa-01cd-48f0-b079-ce75b438144c-ovnkube-script-lib\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281630 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-host-cni-bin\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 
05:57:42.281645 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-host-run-netns\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281665 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-run-ovn\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281684 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-host-cni-netd\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281728 4869 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281756 4869 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281766 4869 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281775 4869 
reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a30082d3-c125-4e76-8ead-3633b967d974-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281784 4869 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281793 4869 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-node-log\") on node \"crc\" DevicePath \"\"" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281802 4869 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281809 4869 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281817 4869 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281826 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxc85\" (UniqueName: \"kubernetes.io/projected/a30082d3-c125-4e76-8ead-3633b967d974-kube-api-access-xxc85\") on node \"crc\" DevicePath \"\"" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281834 4869 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/a30082d3-c125-4e76-8ead-3633b967d974-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281842 4869 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a30082d3-c125-4e76-8ead-3633b967d974-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.281850 4869 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a30082d3-c125-4e76-8ead-3633b967d974-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.383214 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-systemd-units\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.383261 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7b841aa-01cd-48f0-b079-ce75b438144c-env-overrides\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.383285 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-node-log\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.383302 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-run-systemd\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.383322 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t9hf\" (UniqueName: \"kubernetes.io/projected/e7b841aa-01cd-48f0-b079-ce75b438144c-kube-api-access-6t9hf\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.383338 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7b841aa-01cd-48f0-b079-ce75b438144c-ovnkube-script-lib\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.383339 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-systemd-units\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.383397 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-node-log\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.383403 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-host-cni-bin\") pod 
\"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.383354 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-host-cni-bin\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.383443 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-run-systemd\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.383479 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-host-run-netns\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.383514 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-run-ovn\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.383553 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-host-cni-netd\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.383638 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-host-kubelet\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.383657 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-etc-openvswitch\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.383696 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-run-openvswitch\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.383712 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-host-slash\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.383734 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-log-socket\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: 
I0218 05:57:42.383771 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7b841aa-01cd-48f0-b079-ce75b438144c-ovnkube-config\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.383802 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7b841aa-01cd-48f0-b079-ce75b438144c-ovn-node-metrics-cert\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.383820 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.383843 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-var-lib-openvswitch\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.383867 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-host-run-ovn-kubernetes\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc 
kubenswrapper[4869]: I0218 05:57:42.383928 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-host-run-ovn-kubernetes\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.383950 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-host-run-netns\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.383970 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-run-ovn\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.383990 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-host-cni-netd\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.384011 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-host-kubelet\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.384029 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-etc-openvswitch\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.384049 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-run-openvswitch\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.384068 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-host-slash\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.384087 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-log-socket\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.384467 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-var-lib-openvswitch\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.384538 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7b841aa-01cd-48f0-b079-ce75b438144c-ovnkube-script-lib\") 
pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.384646 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7b841aa-01cd-48f0-b079-ce75b438144c-ovnkube-config\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.384980 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7b841aa-01cd-48f0-b079-ce75b438144c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.385117 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7b841aa-01cd-48f0-b079-ce75b438144c-env-overrides\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.388757 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7b841aa-01cd-48f0-b079-ce75b438144c-ovn-node-metrics-cert\") pod \"ovnkube-node-wmp96\" (UID: \"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.410636 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t9hf\" (UniqueName: \"kubernetes.io/projected/e7b841aa-01cd-48f0-b079-ce75b438144c-kube-api-access-6t9hf\") pod \"ovnkube-node-wmp96\" (UID: 
\"e7b841aa-01cd-48f0-b079-ce75b438144c\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.523119 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:42 crc kubenswrapper[4869]: W0218 05:57:42.538565 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7b841aa_01cd_48f0_b079_ce75b438144c.slice/crio-90bc7a7050d41bcb6dbb456ba5f82d872102917111930d57e93ed7cac608c84d WatchSource:0}: Error finding container 90bc7a7050d41bcb6dbb456ba5f82d872102917111930d57e93ed7cac608c84d: Status 404 returned error can't find the container with id 90bc7a7050d41bcb6dbb456ba5f82d872102917111930d57e93ed7cac608c84d Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.994473 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2gzwj_87706ebb-d517-4b38-a542-d0afd6c8c9c2/kube-multus/0.log" Feb 18 05:57:42 crc kubenswrapper[4869]: I0218 05:57:42.995431 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2gzwj" event={"ID":"87706ebb-d517-4b38-a542-d0afd6c8c9c2","Type":"ContainerStarted","Data":"9566c9e8cdb0f30cc2abe6cd3d88bcd2338320caa8124cec045ff59b952935ca"} Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.006470 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mvs9q_a30082d3-c125-4e76-8ead-3633b967d974/ovn-acl-logging/0.log" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.007645 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mvs9q_a30082d3-c125-4e76-8ead-3633b967d974/ovn-controller/0.log" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.008181 4869 generic.go:334] "Generic (PLEG): container finished" podID="a30082d3-c125-4e76-8ead-3633b967d974" 
containerID="0837ffbf5f6a9c3305554c6fb64fe0352d25511750c6d4e61082f168d3b9a7e6" exitCode=0 Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.008252 4869 generic.go:334] "Generic (PLEG): container finished" podID="a30082d3-c125-4e76-8ead-3633b967d974" containerID="70862fff347446129454a43683a347883e36d81a5d909013b5314500b33312cd" exitCode=0 Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.008271 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" event={"ID":"a30082d3-c125-4e76-8ead-3633b967d974","Type":"ContainerDied","Data":"0837ffbf5f6a9c3305554c6fb64fe0352d25511750c6d4e61082f168d3b9a7e6"} Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.008376 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.008508 4869 scope.go:117] "RemoveContainer" containerID="fdf57f8214c9edcc60b7f0fe48fd3e36de28209082e5713a5f43dc740db378ee" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.008513 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" event={"ID":"a30082d3-c125-4e76-8ead-3633b967d974","Type":"ContainerDied","Data":"70862fff347446129454a43683a347883e36d81a5d909013b5314500b33312cd"} Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.008669 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvs9q" event={"ID":"a30082d3-c125-4e76-8ead-3633b967d974","Type":"ContainerDied","Data":"227b27eeb303280cfbe417c3902c92fda4c4946cf1a5d23602243928aadbedf0"} Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.010933 4869 generic.go:334] "Generic (PLEG): container finished" podID="e7b841aa-01cd-48f0-b079-ce75b438144c" containerID="2c73580a6a2a1a9b21bb09d28e6346e1431921f18874d81519f5d2e4baa75d1a" exitCode=0 Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.011035 4869 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" event={"ID":"e7b841aa-01cd-48f0-b079-ce75b438144c","Type":"ContainerDied","Data":"2c73580a6a2a1a9b21bb09d28e6346e1431921f18874d81519f5d2e4baa75d1a"} Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.011089 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" event={"ID":"e7b841aa-01cd-48f0-b079-ce75b438144c","Type":"ContainerStarted","Data":"90bc7a7050d41bcb6dbb456ba5f82d872102917111930d57e93ed7cac608c84d"} Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.050401 4869 scope.go:117] "RemoveContainer" containerID="0837ffbf5f6a9c3305554c6fb64fe0352d25511750c6d4e61082f168d3b9a7e6" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.098053 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mvs9q"] Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.102416 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mvs9q"] Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.112897 4869 scope.go:117] "RemoveContainer" containerID="70862fff347446129454a43683a347883e36d81a5d909013b5314500b33312cd" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.128825 4869 scope.go:117] "RemoveContainer" containerID="41418d0c8201ad9058504d96c160609a322ed6366875006af5d63e23b22d5a2f" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.165819 4869 scope.go:117] "RemoveContainer" containerID="4ee8aadf6ad995457023ee0e612a1c97316952c5e23c95031d54c41f8eb46f92" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.196179 4869 scope.go:117] "RemoveContainer" containerID="4000b9032973107bf63573949eba7869ac1e56195a634cbe30d60dea8cdc92c9" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.212129 4869 scope.go:117] "RemoveContainer" containerID="b109af6f911eae9240a588e5fc6d09fac51c9d272033edc23d5dec25cfad746a" Feb 18 05:57:43 
crc kubenswrapper[4869]: I0218 05:57:43.226917 4869 scope.go:117] "RemoveContainer" containerID="e7307f4b57ff0bdeb6c357526c1b4f36e8187d31776212a82f1cfa23b0967ad2" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.245923 4869 scope.go:117] "RemoveContainer" containerID="8854492f2a0b8e6232e63a3dc435950034b82d8b6738f0f98a70fc5ff5bfcf54" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.271173 4869 scope.go:117] "RemoveContainer" containerID="fdf57f8214c9edcc60b7f0fe48fd3e36de28209082e5713a5f43dc740db378ee" Feb 18 05:57:43 crc kubenswrapper[4869]: E0218 05:57:43.271943 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdf57f8214c9edcc60b7f0fe48fd3e36de28209082e5713a5f43dc740db378ee\": container with ID starting with fdf57f8214c9edcc60b7f0fe48fd3e36de28209082e5713a5f43dc740db378ee not found: ID does not exist" containerID="fdf57f8214c9edcc60b7f0fe48fd3e36de28209082e5713a5f43dc740db378ee" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.271987 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdf57f8214c9edcc60b7f0fe48fd3e36de28209082e5713a5f43dc740db378ee"} err="failed to get container status \"fdf57f8214c9edcc60b7f0fe48fd3e36de28209082e5713a5f43dc740db378ee\": rpc error: code = NotFound desc = could not find container \"fdf57f8214c9edcc60b7f0fe48fd3e36de28209082e5713a5f43dc740db378ee\": container with ID starting with fdf57f8214c9edcc60b7f0fe48fd3e36de28209082e5713a5f43dc740db378ee not found: ID does not exist" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.272016 4869 scope.go:117] "RemoveContainer" containerID="0837ffbf5f6a9c3305554c6fb64fe0352d25511750c6d4e61082f168d3b9a7e6" Feb 18 05:57:43 crc kubenswrapper[4869]: E0218 05:57:43.272392 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0837ffbf5f6a9c3305554c6fb64fe0352d25511750c6d4e61082f168d3b9a7e6\": container with ID starting with 0837ffbf5f6a9c3305554c6fb64fe0352d25511750c6d4e61082f168d3b9a7e6 not found: ID does not exist" containerID="0837ffbf5f6a9c3305554c6fb64fe0352d25511750c6d4e61082f168d3b9a7e6" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.272416 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0837ffbf5f6a9c3305554c6fb64fe0352d25511750c6d4e61082f168d3b9a7e6"} err="failed to get container status \"0837ffbf5f6a9c3305554c6fb64fe0352d25511750c6d4e61082f168d3b9a7e6\": rpc error: code = NotFound desc = could not find container \"0837ffbf5f6a9c3305554c6fb64fe0352d25511750c6d4e61082f168d3b9a7e6\": container with ID starting with 0837ffbf5f6a9c3305554c6fb64fe0352d25511750c6d4e61082f168d3b9a7e6 not found: ID does not exist" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.272429 4869 scope.go:117] "RemoveContainer" containerID="70862fff347446129454a43683a347883e36d81a5d909013b5314500b33312cd" Feb 18 05:57:43 crc kubenswrapper[4869]: E0218 05:57:43.272709 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70862fff347446129454a43683a347883e36d81a5d909013b5314500b33312cd\": container with ID starting with 70862fff347446129454a43683a347883e36d81a5d909013b5314500b33312cd not found: ID does not exist" containerID="70862fff347446129454a43683a347883e36d81a5d909013b5314500b33312cd" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.272739 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70862fff347446129454a43683a347883e36d81a5d909013b5314500b33312cd"} err="failed to get container status \"70862fff347446129454a43683a347883e36d81a5d909013b5314500b33312cd\": rpc error: code = NotFound desc = could not find container \"70862fff347446129454a43683a347883e36d81a5d909013b5314500b33312cd\": container with ID 
starting with 70862fff347446129454a43683a347883e36d81a5d909013b5314500b33312cd not found: ID does not exist" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.272772 4869 scope.go:117] "RemoveContainer" containerID="41418d0c8201ad9058504d96c160609a322ed6366875006af5d63e23b22d5a2f" Feb 18 05:57:43 crc kubenswrapper[4869]: E0218 05:57:43.273069 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41418d0c8201ad9058504d96c160609a322ed6366875006af5d63e23b22d5a2f\": container with ID starting with 41418d0c8201ad9058504d96c160609a322ed6366875006af5d63e23b22d5a2f not found: ID does not exist" containerID="41418d0c8201ad9058504d96c160609a322ed6366875006af5d63e23b22d5a2f" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.273094 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41418d0c8201ad9058504d96c160609a322ed6366875006af5d63e23b22d5a2f"} err="failed to get container status \"41418d0c8201ad9058504d96c160609a322ed6366875006af5d63e23b22d5a2f\": rpc error: code = NotFound desc = could not find container \"41418d0c8201ad9058504d96c160609a322ed6366875006af5d63e23b22d5a2f\": container with ID starting with 41418d0c8201ad9058504d96c160609a322ed6366875006af5d63e23b22d5a2f not found: ID does not exist" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.273109 4869 scope.go:117] "RemoveContainer" containerID="4ee8aadf6ad995457023ee0e612a1c97316952c5e23c95031d54c41f8eb46f92" Feb 18 05:57:43 crc kubenswrapper[4869]: E0218 05:57:43.273333 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ee8aadf6ad995457023ee0e612a1c97316952c5e23c95031d54c41f8eb46f92\": container with ID starting with 4ee8aadf6ad995457023ee0e612a1c97316952c5e23c95031d54c41f8eb46f92 not found: ID does not exist" containerID="4ee8aadf6ad995457023ee0e612a1c97316952c5e23c95031d54c41f8eb46f92" Feb 18 
05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.273365 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ee8aadf6ad995457023ee0e612a1c97316952c5e23c95031d54c41f8eb46f92"} err="failed to get container status \"4ee8aadf6ad995457023ee0e612a1c97316952c5e23c95031d54c41f8eb46f92\": rpc error: code = NotFound desc = could not find container \"4ee8aadf6ad995457023ee0e612a1c97316952c5e23c95031d54c41f8eb46f92\": container with ID starting with 4ee8aadf6ad995457023ee0e612a1c97316952c5e23c95031d54c41f8eb46f92 not found: ID does not exist" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.273383 4869 scope.go:117] "RemoveContainer" containerID="4000b9032973107bf63573949eba7869ac1e56195a634cbe30d60dea8cdc92c9" Feb 18 05:57:43 crc kubenswrapper[4869]: E0218 05:57:43.273599 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4000b9032973107bf63573949eba7869ac1e56195a634cbe30d60dea8cdc92c9\": container with ID starting with 4000b9032973107bf63573949eba7869ac1e56195a634cbe30d60dea8cdc92c9 not found: ID does not exist" containerID="4000b9032973107bf63573949eba7869ac1e56195a634cbe30d60dea8cdc92c9" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.273629 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4000b9032973107bf63573949eba7869ac1e56195a634cbe30d60dea8cdc92c9"} err="failed to get container status \"4000b9032973107bf63573949eba7869ac1e56195a634cbe30d60dea8cdc92c9\": rpc error: code = NotFound desc = could not find container \"4000b9032973107bf63573949eba7869ac1e56195a634cbe30d60dea8cdc92c9\": container with ID starting with 4000b9032973107bf63573949eba7869ac1e56195a634cbe30d60dea8cdc92c9 not found: ID does not exist" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.273645 4869 scope.go:117] "RemoveContainer" 
containerID="b109af6f911eae9240a588e5fc6d09fac51c9d272033edc23d5dec25cfad746a" Feb 18 05:57:43 crc kubenswrapper[4869]: E0218 05:57:43.274038 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b109af6f911eae9240a588e5fc6d09fac51c9d272033edc23d5dec25cfad746a\": container with ID starting with b109af6f911eae9240a588e5fc6d09fac51c9d272033edc23d5dec25cfad746a not found: ID does not exist" containerID="b109af6f911eae9240a588e5fc6d09fac51c9d272033edc23d5dec25cfad746a" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.274107 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b109af6f911eae9240a588e5fc6d09fac51c9d272033edc23d5dec25cfad746a"} err="failed to get container status \"b109af6f911eae9240a588e5fc6d09fac51c9d272033edc23d5dec25cfad746a\": rpc error: code = NotFound desc = could not find container \"b109af6f911eae9240a588e5fc6d09fac51c9d272033edc23d5dec25cfad746a\": container with ID starting with b109af6f911eae9240a588e5fc6d09fac51c9d272033edc23d5dec25cfad746a not found: ID does not exist" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.274159 4869 scope.go:117] "RemoveContainer" containerID="e7307f4b57ff0bdeb6c357526c1b4f36e8187d31776212a82f1cfa23b0967ad2" Feb 18 05:57:43 crc kubenswrapper[4869]: E0218 05:57:43.274632 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7307f4b57ff0bdeb6c357526c1b4f36e8187d31776212a82f1cfa23b0967ad2\": container with ID starting with e7307f4b57ff0bdeb6c357526c1b4f36e8187d31776212a82f1cfa23b0967ad2 not found: ID does not exist" containerID="e7307f4b57ff0bdeb6c357526c1b4f36e8187d31776212a82f1cfa23b0967ad2" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.274655 4869 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e7307f4b57ff0bdeb6c357526c1b4f36e8187d31776212a82f1cfa23b0967ad2"} err="failed to get container status \"e7307f4b57ff0bdeb6c357526c1b4f36e8187d31776212a82f1cfa23b0967ad2\": rpc error: code = NotFound desc = could not find container \"e7307f4b57ff0bdeb6c357526c1b4f36e8187d31776212a82f1cfa23b0967ad2\": container with ID starting with e7307f4b57ff0bdeb6c357526c1b4f36e8187d31776212a82f1cfa23b0967ad2 not found: ID does not exist" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.274669 4869 scope.go:117] "RemoveContainer" containerID="8854492f2a0b8e6232e63a3dc435950034b82d8b6738f0f98a70fc5ff5bfcf54" Feb 18 05:57:43 crc kubenswrapper[4869]: E0218 05:57:43.274945 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8854492f2a0b8e6232e63a3dc435950034b82d8b6738f0f98a70fc5ff5bfcf54\": container with ID starting with 8854492f2a0b8e6232e63a3dc435950034b82d8b6738f0f98a70fc5ff5bfcf54 not found: ID does not exist" containerID="8854492f2a0b8e6232e63a3dc435950034b82d8b6738f0f98a70fc5ff5bfcf54" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.275374 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8854492f2a0b8e6232e63a3dc435950034b82d8b6738f0f98a70fc5ff5bfcf54"} err="failed to get container status \"8854492f2a0b8e6232e63a3dc435950034b82d8b6738f0f98a70fc5ff5bfcf54\": rpc error: code = NotFound desc = could not find container \"8854492f2a0b8e6232e63a3dc435950034b82d8b6738f0f98a70fc5ff5bfcf54\": container with ID starting with 8854492f2a0b8e6232e63a3dc435950034b82d8b6738f0f98a70fc5ff5bfcf54 not found: ID does not exist" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.275394 4869 scope.go:117] "RemoveContainer" containerID="fdf57f8214c9edcc60b7f0fe48fd3e36de28209082e5713a5f43dc740db378ee" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.275659 4869 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"fdf57f8214c9edcc60b7f0fe48fd3e36de28209082e5713a5f43dc740db378ee"} err="failed to get container status \"fdf57f8214c9edcc60b7f0fe48fd3e36de28209082e5713a5f43dc740db378ee\": rpc error: code = NotFound desc = could not find container \"fdf57f8214c9edcc60b7f0fe48fd3e36de28209082e5713a5f43dc740db378ee\": container with ID starting with fdf57f8214c9edcc60b7f0fe48fd3e36de28209082e5713a5f43dc740db378ee not found: ID does not exist" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.275686 4869 scope.go:117] "RemoveContainer" containerID="0837ffbf5f6a9c3305554c6fb64fe0352d25511750c6d4e61082f168d3b9a7e6" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.276032 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0837ffbf5f6a9c3305554c6fb64fe0352d25511750c6d4e61082f168d3b9a7e6"} err="failed to get container status \"0837ffbf5f6a9c3305554c6fb64fe0352d25511750c6d4e61082f168d3b9a7e6\": rpc error: code = NotFound desc = could not find container \"0837ffbf5f6a9c3305554c6fb64fe0352d25511750c6d4e61082f168d3b9a7e6\": container with ID starting with 0837ffbf5f6a9c3305554c6fb64fe0352d25511750c6d4e61082f168d3b9a7e6 not found: ID does not exist" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.276058 4869 scope.go:117] "RemoveContainer" containerID="70862fff347446129454a43683a347883e36d81a5d909013b5314500b33312cd" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.276554 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70862fff347446129454a43683a347883e36d81a5d909013b5314500b33312cd"} err="failed to get container status \"70862fff347446129454a43683a347883e36d81a5d909013b5314500b33312cd\": rpc error: code = NotFound desc = could not find container \"70862fff347446129454a43683a347883e36d81a5d909013b5314500b33312cd\": container with ID starting with 70862fff347446129454a43683a347883e36d81a5d909013b5314500b33312cd not 
found: ID does not exist" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.276575 4869 scope.go:117] "RemoveContainer" containerID="41418d0c8201ad9058504d96c160609a322ed6366875006af5d63e23b22d5a2f" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.277055 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41418d0c8201ad9058504d96c160609a322ed6366875006af5d63e23b22d5a2f"} err="failed to get container status \"41418d0c8201ad9058504d96c160609a322ed6366875006af5d63e23b22d5a2f\": rpc error: code = NotFound desc = could not find container \"41418d0c8201ad9058504d96c160609a322ed6366875006af5d63e23b22d5a2f\": container with ID starting with 41418d0c8201ad9058504d96c160609a322ed6366875006af5d63e23b22d5a2f not found: ID does not exist" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.277103 4869 scope.go:117] "RemoveContainer" containerID="4ee8aadf6ad995457023ee0e612a1c97316952c5e23c95031d54c41f8eb46f92" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.278853 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ee8aadf6ad995457023ee0e612a1c97316952c5e23c95031d54c41f8eb46f92"} err="failed to get container status \"4ee8aadf6ad995457023ee0e612a1c97316952c5e23c95031d54c41f8eb46f92\": rpc error: code = NotFound desc = could not find container \"4ee8aadf6ad995457023ee0e612a1c97316952c5e23c95031d54c41f8eb46f92\": container with ID starting with 4ee8aadf6ad995457023ee0e612a1c97316952c5e23c95031d54c41f8eb46f92 not found: ID does not exist" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.278881 4869 scope.go:117] "RemoveContainer" containerID="4000b9032973107bf63573949eba7869ac1e56195a634cbe30d60dea8cdc92c9" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.279455 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4000b9032973107bf63573949eba7869ac1e56195a634cbe30d60dea8cdc92c9"} err="failed to get 
container status \"4000b9032973107bf63573949eba7869ac1e56195a634cbe30d60dea8cdc92c9\": rpc error: code = NotFound desc = could not find container \"4000b9032973107bf63573949eba7869ac1e56195a634cbe30d60dea8cdc92c9\": container with ID starting with 4000b9032973107bf63573949eba7869ac1e56195a634cbe30d60dea8cdc92c9 not found: ID does not exist" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.279481 4869 scope.go:117] "RemoveContainer" containerID="b109af6f911eae9240a588e5fc6d09fac51c9d272033edc23d5dec25cfad746a" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.279951 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b109af6f911eae9240a588e5fc6d09fac51c9d272033edc23d5dec25cfad746a"} err="failed to get container status \"b109af6f911eae9240a588e5fc6d09fac51c9d272033edc23d5dec25cfad746a\": rpc error: code = NotFound desc = could not find container \"b109af6f911eae9240a588e5fc6d09fac51c9d272033edc23d5dec25cfad746a\": container with ID starting with b109af6f911eae9240a588e5fc6d09fac51c9d272033edc23d5dec25cfad746a not found: ID does not exist" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.279984 4869 scope.go:117] "RemoveContainer" containerID="e7307f4b57ff0bdeb6c357526c1b4f36e8187d31776212a82f1cfa23b0967ad2" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.280322 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7307f4b57ff0bdeb6c357526c1b4f36e8187d31776212a82f1cfa23b0967ad2"} err="failed to get container status \"e7307f4b57ff0bdeb6c357526c1b4f36e8187d31776212a82f1cfa23b0967ad2\": rpc error: code = NotFound desc = could not find container \"e7307f4b57ff0bdeb6c357526c1b4f36e8187d31776212a82f1cfa23b0967ad2\": container with ID starting with e7307f4b57ff0bdeb6c357526c1b4f36e8187d31776212a82f1cfa23b0967ad2 not found: ID does not exist" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.280348 4869 scope.go:117] "RemoveContainer" 
containerID="8854492f2a0b8e6232e63a3dc435950034b82d8b6738f0f98a70fc5ff5bfcf54" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.280639 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8854492f2a0b8e6232e63a3dc435950034b82d8b6738f0f98a70fc5ff5bfcf54"} err="failed to get container status \"8854492f2a0b8e6232e63a3dc435950034b82d8b6738f0f98a70fc5ff5bfcf54\": rpc error: code = NotFound desc = could not find container \"8854492f2a0b8e6232e63a3dc435950034b82d8b6738f0f98a70fc5ff5bfcf54\": container with ID starting with 8854492f2a0b8e6232e63a3dc435950034b82d8b6738f0f98a70fc5ff5bfcf54 not found: ID does not exist" Feb 18 05:57:43 crc kubenswrapper[4869]: I0218 05:57:43.479159 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a30082d3-c125-4e76-8ead-3633b967d974" path="/var/lib/kubelet/pods/a30082d3-c125-4e76-8ead-3633b967d974/volumes" Feb 18 05:57:44 crc kubenswrapper[4869]: I0218 05:57:44.017771 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" event={"ID":"e7b841aa-01cd-48f0-b079-ce75b438144c","Type":"ContainerStarted","Data":"59373df41582e29bf5dd9b85b368c4662ab3f478f945c7b21ecf628460975d1a"} Feb 18 05:57:44 crc kubenswrapper[4869]: I0218 05:57:44.018090 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" event={"ID":"e7b841aa-01cd-48f0-b079-ce75b438144c","Type":"ContainerStarted","Data":"70f3ca22db24d27e6925c0565b859bacd4c85bfb93573639e60b4002ad810518"} Feb 18 05:57:44 crc kubenswrapper[4869]: I0218 05:57:44.018104 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" event={"ID":"e7b841aa-01cd-48f0-b079-ce75b438144c","Type":"ContainerStarted","Data":"01e8fb64269fe2c20b13f7978adbf74013cfd4456c55bfcf4c29bde30291a415"} Feb 18 05:57:44 crc kubenswrapper[4869]: I0218 05:57:44.018115 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" event={"ID":"e7b841aa-01cd-48f0-b079-ce75b438144c","Type":"ContainerStarted","Data":"537410861a057e1a3e65402d2ed0b872e705c03ac83714054c2e5d2b6cf6d424"} Feb 18 05:57:44 crc kubenswrapper[4869]: I0218 05:57:44.018124 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" event={"ID":"e7b841aa-01cd-48f0-b079-ce75b438144c","Type":"ContainerStarted","Data":"37805fb8eb8c9a45f0368a39754baa67b18e854d98037a3e398ba7d849a43631"} Feb 18 05:57:44 crc kubenswrapper[4869]: I0218 05:57:44.018132 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" event={"ID":"e7b841aa-01cd-48f0-b079-ce75b438144c","Type":"ContainerStarted","Data":"f6ea1110dd1ed7c9ac2a71d483b900caa7d601e6ac69ff447cf74683c30cee58"} Feb 18 05:57:46 crc kubenswrapper[4869]: I0218 05:57:46.034766 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" event={"ID":"e7b841aa-01cd-48f0-b079-ce75b438144c","Type":"ContainerStarted","Data":"f7f3126ea497a8c7dcd450f1eb1e3b66832a273a432eb76fac4a74ff833cdd44"} Feb 18 05:57:49 crc kubenswrapper[4869]: I0218 05:57:49.054615 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" event={"ID":"e7b841aa-01cd-48f0-b079-ce75b438144c","Type":"ContainerStarted","Data":"856e03357261cded2efc23d939ef68c7caeed40334000c16f20026b51d0510c6"} Feb 18 05:57:49 crc kubenswrapper[4869]: I0218 05:57:49.058947 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:49 crc kubenswrapper[4869]: I0218 05:57:49.058980 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:49 crc kubenswrapper[4869]: I0218 05:57:49.091594 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" podStartSLOduration=7.091570888 podStartE2EDuration="7.091570888s" podCreationTimestamp="2026-02-18 05:57:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:57:49.089473627 +0000 UTC m=+566.258561909" watchObservedRunningTime="2026-02-18 05:57:49.091570888 +0000 UTC m=+566.260659130" Feb 18 05:57:49 crc kubenswrapper[4869]: I0218 05:57:49.093351 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:50 crc kubenswrapper[4869]: I0218 05:57:50.061972 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:57:50 crc kubenswrapper[4869]: I0218 05:57:50.086372 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:58:10 crc kubenswrapper[4869]: I0218 05:58:10.132786 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 05:58:10 crc kubenswrapper[4869]: I0218 05:58:10.133868 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 05:58:10 crc kubenswrapper[4869]: I0218 05:58:10.133946 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" Feb 18 05:58:10 crc kubenswrapper[4869]: I0218 
05:58:10.135213 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9af2aed1a265c6c4127223a14b7d2dfbeb17faca5aaf7f8066c5e58e1ab7d105"} pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 05:58:10 crc kubenswrapper[4869]: I0218 05:58:10.135331 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" containerID="cri-o://9af2aed1a265c6c4127223a14b7d2dfbeb17faca5aaf7f8066c5e58e1ab7d105" gracePeriod=600 Feb 18 05:58:11 crc kubenswrapper[4869]: I0218 05:58:11.192017 4869 generic.go:334] "Generic (PLEG): container finished" podID="781aec66-5fc7-4161-a704-cc78830d525d" containerID="9af2aed1a265c6c4127223a14b7d2dfbeb17faca5aaf7f8066c5e58e1ab7d105" exitCode=0 Feb 18 05:58:11 crc kubenswrapper[4869]: I0218 05:58:11.192093 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerDied","Data":"9af2aed1a265c6c4127223a14b7d2dfbeb17faca5aaf7f8066c5e58e1ab7d105"} Feb 18 05:58:11 crc kubenswrapper[4869]: I0218 05:58:11.192505 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerStarted","Data":"7562193726eefe80121fb4b3382e37ca22d430274c9cd86ef820a8666e2ec8f9"} Feb 18 05:58:11 crc kubenswrapper[4869]: I0218 05:58:11.192536 4869 scope.go:117] "RemoveContainer" containerID="e6cda088debb8f246e50d84872e8120984c327e81d95605578875258e09eeddf" Feb 18 05:58:12 crc kubenswrapper[4869]: I0218 05:58:12.551228 4869 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wmp96" Feb 18 05:58:19 crc kubenswrapper[4869]: I0218 05:58:19.271933 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7"] Feb 18 05:58:19 crc kubenswrapper[4869]: I0218 05:58:19.273405 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7" Feb 18 05:58:19 crc kubenswrapper[4869]: I0218 05:58:19.278936 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 18 05:58:19 crc kubenswrapper[4869]: I0218 05:58:19.279948 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7"] Feb 18 05:58:19 crc kubenswrapper[4869]: I0218 05:58:19.406697 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd932d93-4a7d-4779-8833-23887167b576-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7\" (UID: \"dd932d93-4a7d-4779-8833-23887167b576\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7" Feb 18 05:58:19 crc kubenswrapper[4869]: I0218 05:58:19.406847 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd932d93-4a7d-4779-8833-23887167b576-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7\" (UID: \"dd932d93-4a7d-4779-8833-23887167b576\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7" Feb 18 05:58:19 crc kubenswrapper[4869]: I0218 05:58:19.406893 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzs95\" (UniqueName: \"kubernetes.io/projected/dd932d93-4a7d-4779-8833-23887167b576-kube-api-access-gzs95\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7\" (UID: \"dd932d93-4a7d-4779-8833-23887167b576\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7" Feb 18 05:58:19 crc kubenswrapper[4869]: I0218 05:58:19.508074 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd932d93-4a7d-4779-8833-23887167b576-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7\" (UID: \"dd932d93-4a7d-4779-8833-23887167b576\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7" Feb 18 05:58:19 crc kubenswrapper[4869]: I0218 05:58:19.508162 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzs95\" (UniqueName: \"kubernetes.io/projected/dd932d93-4a7d-4779-8833-23887167b576-kube-api-access-gzs95\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7\" (UID: \"dd932d93-4a7d-4779-8833-23887167b576\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7" Feb 18 05:58:19 crc kubenswrapper[4869]: I0218 05:58:19.508201 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd932d93-4a7d-4779-8833-23887167b576-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7\" (UID: \"dd932d93-4a7d-4779-8833-23887167b576\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7" Feb 18 05:58:19 crc kubenswrapper[4869]: I0218 05:58:19.509081 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/dd932d93-4a7d-4779-8833-23887167b576-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7\" (UID: \"dd932d93-4a7d-4779-8833-23887167b576\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7" Feb 18 05:58:19 crc kubenswrapper[4869]: I0218 05:58:19.509096 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd932d93-4a7d-4779-8833-23887167b576-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7\" (UID: \"dd932d93-4a7d-4779-8833-23887167b576\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7" Feb 18 05:58:19 crc kubenswrapper[4869]: I0218 05:58:19.527951 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzs95\" (UniqueName: \"kubernetes.io/projected/dd932d93-4a7d-4779-8833-23887167b576-kube-api-access-gzs95\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7\" (UID: \"dd932d93-4a7d-4779-8833-23887167b576\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7" Feb 18 05:58:19 crc kubenswrapper[4869]: I0218 05:58:19.611076 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7" Feb 18 05:58:19 crc kubenswrapper[4869]: I0218 05:58:19.796054 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7"] Feb 18 05:58:20 crc kubenswrapper[4869]: I0218 05:58:20.242280 4869 generic.go:334] "Generic (PLEG): container finished" podID="dd932d93-4a7d-4779-8833-23887167b576" containerID="ccfb9c406b0ebeb83b6de3b2104fb2f0c3876795bccf0a0f8630ebd14aafb1ca" exitCode=0 Feb 18 05:58:20 crc kubenswrapper[4869]: I0218 05:58:20.242555 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7" event={"ID":"dd932d93-4a7d-4779-8833-23887167b576","Type":"ContainerDied","Data":"ccfb9c406b0ebeb83b6de3b2104fb2f0c3876795bccf0a0f8630ebd14aafb1ca"} Feb 18 05:58:20 crc kubenswrapper[4869]: I0218 05:58:20.242661 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7" event={"ID":"dd932d93-4a7d-4779-8833-23887167b576","Type":"ContainerStarted","Data":"9ca9bb1765d7f6a9f04ab8db225fe7c71edf28a8eb39a5f9b8e3ed91a79b356e"} Feb 18 05:58:22 crc kubenswrapper[4869]: I0218 05:58:22.258396 4869 generic.go:334] "Generic (PLEG): container finished" podID="dd932d93-4a7d-4779-8833-23887167b576" containerID="9452267d323c474a54145b58802f227f8a9851622ae37cb199f0c768e26e8d3e" exitCode=0 Feb 18 05:58:22 crc kubenswrapper[4869]: I0218 05:58:22.258478 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7" event={"ID":"dd932d93-4a7d-4779-8833-23887167b576","Type":"ContainerDied","Data":"9452267d323c474a54145b58802f227f8a9851622ae37cb199f0c768e26e8d3e"} Feb 18 05:58:23 crc kubenswrapper[4869]: I0218 05:58:23.271482 4869 
generic.go:334] "Generic (PLEG): container finished" podID="dd932d93-4a7d-4779-8833-23887167b576" containerID="a640f799aa3d6bb1862ba43e07ebf817c4ac7d5ea65a03bbf4fe25715685e6ed" exitCode=0 Feb 18 05:58:23 crc kubenswrapper[4869]: I0218 05:58:23.271522 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7" event={"ID":"dd932d93-4a7d-4779-8833-23887167b576","Type":"ContainerDied","Data":"a640f799aa3d6bb1862ba43e07ebf817c4ac7d5ea65a03bbf4fe25715685e6ed"} Feb 18 05:58:24 crc kubenswrapper[4869]: I0218 05:58:24.567313 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7" Feb 18 05:58:24 crc kubenswrapper[4869]: I0218 05:58:24.669388 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd932d93-4a7d-4779-8833-23887167b576-bundle\") pod \"dd932d93-4a7d-4779-8833-23887167b576\" (UID: \"dd932d93-4a7d-4779-8833-23887167b576\") " Feb 18 05:58:24 crc kubenswrapper[4869]: I0218 05:58:24.669475 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd932d93-4a7d-4779-8833-23887167b576-util\") pod \"dd932d93-4a7d-4779-8833-23887167b576\" (UID: \"dd932d93-4a7d-4779-8833-23887167b576\") " Feb 18 05:58:24 crc kubenswrapper[4869]: I0218 05:58:24.669525 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzs95\" (UniqueName: \"kubernetes.io/projected/dd932d93-4a7d-4779-8833-23887167b576-kube-api-access-gzs95\") pod \"dd932d93-4a7d-4779-8833-23887167b576\" (UID: \"dd932d93-4a7d-4779-8833-23887167b576\") " Feb 18 05:58:24 crc kubenswrapper[4869]: I0218 05:58:24.670343 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/dd932d93-4a7d-4779-8833-23887167b576-bundle" (OuterVolumeSpecName: "bundle") pod "dd932d93-4a7d-4779-8833-23887167b576" (UID: "dd932d93-4a7d-4779-8833-23887167b576"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:58:24 crc kubenswrapper[4869]: I0218 05:58:24.675250 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd932d93-4a7d-4779-8833-23887167b576-kube-api-access-gzs95" (OuterVolumeSpecName: "kube-api-access-gzs95") pod "dd932d93-4a7d-4779-8833-23887167b576" (UID: "dd932d93-4a7d-4779-8833-23887167b576"). InnerVolumeSpecName "kube-api-access-gzs95". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:58:24 crc kubenswrapper[4869]: I0218 05:58:24.702865 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd932d93-4a7d-4779-8833-23887167b576-util" (OuterVolumeSpecName: "util") pod "dd932d93-4a7d-4779-8833-23887167b576" (UID: "dd932d93-4a7d-4779-8833-23887167b576"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:58:24 crc kubenswrapper[4869]: I0218 05:58:24.771331 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzs95\" (UniqueName: \"kubernetes.io/projected/dd932d93-4a7d-4779-8833-23887167b576-kube-api-access-gzs95\") on node \"crc\" DevicePath \"\"" Feb 18 05:58:24 crc kubenswrapper[4869]: I0218 05:58:24.771396 4869 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd932d93-4a7d-4779-8833-23887167b576-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 05:58:24 crc kubenswrapper[4869]: I0218 05:58:24.771408 4869 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd932d93-4a7d-4779-8833-23887167b576-util\") on node \"crc\" DevicePath \"\"" Feb 18 05:58:25 crc kubenswrapper[4869]: I0218 05:58:25.287510 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7" event={"ID":"dd932d93-4a7d-4779-8833-23887167b576","Type":"ContainerDied","Data":"9ca9bb1765d7f6a9f04ab8db225fe7c71edf28a8eb39a5f9b8e3ed91a79b356e"} Feb 18 05:58:25 crc kubenswrapper[4869]: I0218 05:58:25.287553 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ca9bb1765d7f6a9f04ab8db225fe7c71edf28a8eb39a5f9b8e3ed91a79b356e" Feb 18 05:58:25 crc kubenswrapper[4869]: I0218 05:58:25.287616 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7" Feb 18 05:58:26 crc kubenswrapper[4869]: I0218 05:58:26.979936 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-mfnzm"] Feb 18 05:58:26 crc kubenswrapper[4869]: E0218 05:58:26.980173 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd932d93-4a7d-4779-8833-23887167b576" containerName="extract" Feb 18 05:58:26 crc kubenswrapper[4869]: I0218 05:58:26.980188 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd932d93-4a7d-4779-8833-23887167b576" containerName="extract" Feb 18 05:58:26 crc kubenswrapper[4869]: E0218 05:58:26.980207 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd932d93-4a7d-4779-8833-23887167b576" containerName="pull" Feb 18 05:58:26 crc kubenswrapper[4869]: I0218 05:58:26.980213 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd932d93-4a7d-4779-8833-23887167b576" containerName="pull" Feb 18 05:58:26 crc kubenswrapper[4869]: E0218 05:58:26.980227 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd932d93-4a7d-4779-8833-23887167b576" containerName="util" Feb 18 05:58:26 crc kubenswrapper[4869]: I0218 05:58:26.980234 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd932d93-4a7d-4779-8833-23887167b576" containerName="util" Feb 18 05:58:26 crc kubenswrapper[4869]: I0218 05:58:26.980343 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd932d93-4a7d-4779-8833-23887167b576" containerName="extract" Feb 18 05:58:26 crc kubenswrapper[4869]: I0218 05:58:26.980806 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-mfnzm" Feb 18 05:58:26 crc kubenswrapper[4869]: I0218 05:58:26.983221 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 18 05:58:26 crc kubenswrapper[4869]: I0218 05:58:26.986179 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 18 05:58:26 crc kubenswrapper[4869]: I0218 05:58:26.986638 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-j29w8" Feb 18 05:58:26 crc kubenswrapper[4869]: I0218 05:58:26.991379 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-mfnzm"] Feb 18 05:58:27 crc kubenswrapper[4869]: I0218 05:58:27.100819 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fthpw\" (UniqueName: \"kubernetes.io/projected/851e9bf0-c3bc-454d-a3e3-ade0dd734f5a-kube-api-access-fthpw\") pod \"nmstate-operator-694c9596b7-mfnzm\" (UID: \"851e9bf0-c3bc-454d-a3e3-ade0dd734f5a\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-mfnzm" Feb 18 05:58:27 crc kubenswrapper[4869]: I0218 05:58:27.202339 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fthpw\" (UniqueName: \"kubernetes.io/projected/851e9bf0-c3bc-454d-a3e3-ade0dd734f5a-kube-api-access-fthpw\") pod \"nmstate-operator-694c9596b7-mfnzm\" (UID: \"851e9bf0-c3bc-454d-a3e3-ade0dd734f5a\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-mfnzm" Feb 18 05:58:27 crc kubenswrapper[4869]: I0218 05:58:27.221703 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fthpw\" (UniqueName: \"kubernetes.io/projected/851e9bf0-c3bc-454d-a3e3-ade0dd734f5a-kube-api-access-fthpw\") pod \"nmstate-operator-694c9596b7-mfnzm\" (UID: 
\"851e9bf0-c3bc-454d-a3e3-ade0dd734f5a\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-mfnzm" Feb 18 05:58:27 crc kubenswrapper[4869]: I0218 05:58:27.296548 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-mfnzm" Feb 18 05:58:27 crc kubenswrapper[4869]: I0218 05:58:27.486085 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-mfnzm"] Feb 18 05:58:28 crc kubenswrapper[4869]: I0218 05:58:28.302866 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-mfnzm" event={"ID":"851e9bf0-c3bc-454d-a3e3-ade0dd734f5a","Type":"ContainerStarted","Data":"922a7d5be8b4913b78d989b9bdceaa03048e0209688a069b95f120813793de0d"} Feb 18 05:58:30 crc kubenswrapper[4869]: I0218 05:58:30.314652 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-mfnzm" event={"ID":"851e9bf0-c3bc-454d-a3e3-ade0dd734f5a","Type":"ContainerStarted","Data":"2e09902cb4a34bea8b6aa7e1bb6feb1ef3f26aafa5e422cb0bd1c2ef98a059a1"} Feb 18 05:58:30 crc kubenswrapper[4869]: I0218 05:58:30.331241 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-mfnzm" podStartSLOduration=2.3626112040000002 podStartE2EDuration="4.331224369s" podCreationTimestamp="2026-02-18 05:58:26 +0000 UTC" firstStartedPulling="2026-02-18 05:58:27.491013276 +0000 UTC m=+604.660101508" lastFinishedPulling="2026-02-18 05:58:29.459626451 +0000 UTC m=+606.628714673" observedRunningTime="2026-02-18 05:58:30.329231231 +0000 UTC m=+607.498319473" watchObservedRunningTime="2026-02-18 05:58:30.331224369 +0000 UTC m=+607.500312591" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.343911 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-j5g4n"] Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 
05:58:31.345176 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-j5g4n" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.348434 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.348638 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-ndhsd" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.356729 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-tkd86"] Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.357860 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-tkd86" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.364272 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-j5g4n"] Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.396810 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-tkd86"] Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.399433 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-kljbt"] Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.400382 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-kljbt" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.446846 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7pfc\" (UniqueName: \"kubernetes.io/projected/02b9ed08-af4d-434a-8042-9b4acedf423c-kube-api-access-m7pfc\") pod \"nmstate-handler-kljbt\" (UID: \"02b9ed08-af4d-434a-8042-9b4acedf423c\") " pod="openshift-nmstate/nmstate-handler-kljbt" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.446919 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vnts\" (UniqueName: \"kubernetes.io/projected/1582a9d4-d3f8-4ca1-8853-6e6e8cc10d92-kube-api-access-5vnts\") pod \"nmstate-webhook-866bcb46dc-j5g4n\" (UID: \"1582a9d4-d3f8-4ca1-8853-6e6e8cc10d92\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-j5g4n" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.446989 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/02b9ed08-af4d-434a-8042-9b4acedf423c-nmstate-lock\") pod \"nmstate-handler-kljbt\" (UID: \"02b9ed08-af4d-434a-8042-9b4acedf423c\") " pod="openshift-nmstate/nmstate-handler-kljbt" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.447016 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/02b9ed08-af4d-434a-8042-9b4acedf423c-ovs-socket\") pod \"nmstate-handler-kljbt\" (UID: \"02b9ed08-af4d-434a-8042-9b4acedf423c\") " pod="openshift-nmstate/nmstate-handler-kljbt" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.447059 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggs2l\" (UniqueName: 
\"kubernetes.io/projected/9c59b69a-72a6-4ce0-9b47-c53016b5ac3a-kube-api-access-ggs2l\") pod \"nmstate-metrics-58c85c668d-tkd86\" (UID: \"9c59b69a-72a6-4ce0-9b47-c53016b5ac3a\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-tkd86" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.447145 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/02b9ed08-af4d-434a-8042-9b4acedf423c-dbus-socket\") pod \"nmstate-handler-kljbt\" (UID: \"02b9ed08-af4d-434a-8042-9b4acedf423c\") " pod="openshift-nmstate/nmstate-handler-kljbt" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.447169 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1582a9d4-d3f8-4ca1-8853-6e6e8cc10d92-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-j5g4n\" (UID: \"1582a9d4-d3f8-4ca1-8853-6e6e8cc10d92\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-j5g4n" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.533483 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x7pfd"] Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.534285 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x7pfd" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.541138 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.541264 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-chkpd" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.541381 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.547495 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x7pfd"] Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.548379 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/02b9ed08-af4d-434a-8042-9b4acedf423c-nmstate-lock\") pod \"nmstate-handler-kljbt\" (UID: \"02b9ed08-af4d-434a-8042-9b4acedf423c\") " pod="openshift-nmstate/nmstate-handler-kljbt" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.548435 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/babd7df7-75bc-470a-a960-c1d0317f2f8e-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-x7pfd\" (UID: \"babd7df7-75bc-470a-a960-c1d0317f2f8e\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x7pfd" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.548462 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/02b9ed08-af4d-434a-8042-9b4acedf423c-ovs-socket\") pod \"nmstate-handler-kljbt\" (UID: \"02b9ed08-af4d-434a-8042-9b4acedf423c\") " pod="openshift-nmstate/nmstate-handler-kljbt" Feb 18 05:58:31 crc 
kubenswrapper[4869]: I0218 05:58:31.548502 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggs2l\" (UniqueName: \"kubernetes.io/projected/9c59b69a-72a6-4ce0-9b47-c53016b5ac3a-kube-api-access-ggs2l\") pod \"nmstate-metrics-58c85c668d-tkd86\" (UID: \"9c59b69a-72a6-4ce0-9b47-c53016b5ac3a\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-tkd86" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.548527 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmfbx\" (UniqueName: \"kubernetes.io/projected/babd7df7-75bc-470a-a960-c1d0317f2f8e-kube-api-access-kmfbx\") pod \"nmstate-console-plugin-5c78fc5d65-x7pfd\" (UID: \"babd7df7-75bc-470a-a960-c1d0317f2f8e\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x7pfd" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.548559 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/02b9ed08-af4d-434a-8042-9b4acedf423c-ovs-socket\") pod \"nmstate-handler-kljbt\" (UID: \"02b9ed08-af4d-434a-8042-9b4acedf423c\") " pod="openshift-nmstate/nmstate-handler-kljbt" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.548586 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/02b9ed08-af4d-434a-8042-9b4acedf423c-dbus-socket\") pod \"nmstate-handler-kljbt\" (UID: \"02b9ed08-af4d-434a-8042-9b4acedf423c\") " pod="openshift-nmstate/nmstate-handler-kljbt" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.548502 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/02b9ed08-af4d-434a-8042-9b4acedf423c-nmstate-lock\") pod \"nmstate-handler-kljbt\" (UID: \"02b9ed08-af4d-434a-8042-9b4acedf423c\") " pod="openshift-nmstate/nmstate-handler-kljbt" Feb 18 05:58:31 crc 
kubenswrapper[4869]: I0218 05:58:31.548824 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1582a9d4-d3f8-4ca1-8853-6e6e8cc10d92-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-j5g4n\" (UID: \"1582a9d4-d3f8-4ca1-8853-6e6e8cc10d92\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-j5g4n" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.548880 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/02b9ed08-af4d-434a-8042-9b4acedf423c-dbus-socket\") pod \"nmstate-handler-kljbt\" (UID: \"02b9ed08-af4d-434a-8042-9b4acedf423c\") " pod="openshift-nmstate/nmstate-handler-kljbt" Feb 18 05:58:31 crc kubenswrapper[4869]: E0218 05:58:31.548914 4869 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 18 05:58:31 crc kubenswrapper[4869]: E0218 05:58:31.549006 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1582a9d4-d3f8-4ca1-8853-6e6e8cc10d92-tls-key-pair podName:1582a9d4-d3f8-4ca1-8853-6e6e8cc10d92 nodeName:}" failed. No retries permitted until 2026-02-18 05:58:32.048988869 +0000 UTC m=+609.218077101 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/1582a9d4-d3f8-4ca1-8853-6e6e8cc10d92-tls-key-pair") pod "nmstate-webhook-866bcb46dc-j5g4n" (UID: "1582a9d4-d3f8-4ca1-8853-6e6e8cc10d92") : secret "openshift-nmstate-webhook" not found Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.549050 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7pfc\" (UniqueName: \"kubernetes.io/projected/02b9ed08-af4d-434a-8042-9b4acedf423c-kube-api-access-m7pfc\") pod \"nmstate-handler-kljbt\" (UID: \"02b9ed08-af4d-434a-8042-9b4acedf423c\") " pod="openshift-nmstate/nmstate-handler-kljbt" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.549127 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vnts\" (UniqueName: \"kubernetes.io/projected/1582a9d4-d3f8-4ca1-8853-6e6e8cc10d92-kube-api-access-5vnts\") pod \"nmstate-webhook-866bcb46dc-j5g4n\" (UID: \"1582a9d4-d3f8-4ca1-8853-6e6e8cc10d92\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-j5g4n" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.549180 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/babd7df7-75bc-470a-a960-c1d0317f2f8e-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-x7pfd\" (UID: \"babd7df7-75bc-470a-a960-c1d0317f2f8e\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x7pfd" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.572109 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggs2l\" (UniqueName: \"kubernetes.io/projected/9c59b69a-72a6-4ce0-9b47-c53016b5ac3a-kube-api-access-ggs2l\") pod \"nmstate-metrics-58c85c668d-tkd86\" (UID: \"9c59b69a-72a6-4ce0-9b47-c53016b5ac3a\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-tkd86" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 
05:58:31.576325 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7pfc\" (UniqueName: \"kubernetes.io/projected/02b9ed08-af4d-434a-8042-9b4acedf423c-kube-api-access-m7pfc\") pod \"nmstate-handler-kljbt\" (UID: \"02b9ed08-af4d-434a-8042-9b4acedf423c\") " pod="openshift-nmstate/nmstate-handler-kljbt" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.577485 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vnts\" (UniqueName: \"kubernetes.io/projected/1582a9d4-d3f8-4ca1-8853-6e6e8cc10d92-kube-api-access-5vnts\") pod \"nmstate-webhook-866bcb46dc-j5g4n\" (UID: \"1582a9d4-d3f8-4ca1-8853-6e6e8cc10d92\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-j5g4n" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.650928 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/babd7df7-75bc-470a-a960-c1d0317f2f8e-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-x7pfd\" (UID: \"babd7df7-75bc-470a-a960-c1d0317f2f8e\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x7pfd" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.650999 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/babd7df7-75bc-470a-a960-c1d0317f2f8e-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-x7pfd\" (UID: \"babd7df7-75bc-470a-a960-c1d0317f2f8e\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x7pfd" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.651035 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmfbx\" (UniqueName: \"kubernetes.io/projected/babd7df7-75bc-470a-a960-c1d0317f2f8e-kube-api-access-kmfbx\") pod \"nmstate-console-plugin-5c78fc5d65-x7pfd\" (UID: \"babd7df7-75bc-470a-a960-c1d0317f2f8e\") " 
pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x7pfd" Feb 18 05:58:31 crc kubenswrapper[4869]: E0218 05:58:31.651414 4869 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Feb 18 05:58:31 crc kubenswrapper[4869]: E0218 05:58:31.651484 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/babd7df7-75bc-470a-a960-c1d0317f2f8e-plugin-serving-cert podName:babd7df7-75bc-470a-a960-c1d0317f2f8e nodeName:}" failed. No retries permitted until 2026-02-18 05:58:32.151466879 +0000 UTC m=+609.320555111 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/babd7df7-75bc-470a-a960-c1d0317f2f8e-plugin-serving-cert") pod "nmstate-console-plugin-5c78fc5d65-x7pfd" (UID: "babd7df7-75bc-470a-a960-c1d0317f2f8e") : secret "plugin-serving-cert" not found Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.652833 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/babd7df7-75bc-470a-a960-c1d0317f2f8e-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-x7pfd\" (UID: \"babd7df7-75bc-470a-a960-c1d0317f2f8e\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x7pfd" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.671017 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmfbx\" (UniqueName: \"kubernetes.io/projected/babd7df7-75bc-470a-a960-c1d0317f2f8e-kube-api-access-kmfbx\") pod \"nmstate-console-plugin-5c78fc5d65-x7pfd\" (UID: \"babd7df7-75bc-470a-a960-c1d0317f2f8e\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x7pfd" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.693889 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-tkd86" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.716799 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-kljbt" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.720499 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6ffb9bff49-bf2qv"] Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.721152 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6ffb9bff49-bf2qv" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.731826 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6ffb9bff49-bf2qv"] Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.751917 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/71fc95a1-d2f5-4136-9e8d-a4d9a8582351-console-serving-cert\") pod \"console-6ffb9bff49-bf2qv\" (UID: \"71fc95a1-d2f5-4136-9e8d-a4d9a8582351\") " pod="openshift-console/console-6ffb9bff49-bf2qv" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.751984 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71fc95a1-d2f5-4136-9e8d-a4d9a8582351-trusted-ca-bundle\") pod \"console-6ffb9bff49-bf2qv\" (UID: \"71fc95a1-d2f5-4136-9e8d-a4d9a8582351\") " pod="openshift-console/console-6ffb9bff49-bf2qv" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.752060 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/71fc95a1-d2f5-4136-9e8d-a4d9a8582351-console-oauth-config\") pod \"console-6ffb9bff49-bf2qv\" (UID: \"71fc95a1-d2f5-4136-9e8d-a4d9a8582351\") " 
pod="openshift-console/console-6ffb9bff49-bf2qv" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.752111 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/71fc95a1-d2f5-4136-9e8d-a4d9a8582351-oauth-serving-cert\") pod \"console-6ffb9bff49-bf2qv\" (UID: \"71fc95a1-d2f5-4136-9e8d-a4d9a8582351\") " pod="openshift-console/console-6ffb9bff49-bf2qv" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.752134 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/71fc95a1-d2f5-4136-9e8d-a4d9a8582351-service-ca\") pod \"console-6ffb9bff49-bf2qv\" (UID: \"71fc95a1-d2f5-4136-9e8d-a4d9a8582351\") " pod="openshift-console/console-6ffb9bff49-bf2qv" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.752158 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rvwk\" (UniqueName: \"kubernetes.io/projected/71fc95a1-d2f5-4136-9e8d-a4d9a8582351-kube-api-access-6rvwk\") pod \"console-6ffb9bff49-bf2qv\" (UID: \"71fc95a1-d2f5-4136-9e8d-a4d9a8582351\") " pod="openshift-console/console-6ffb9bff49-bf2qv" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.752186 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/71fc95a1-d2f5-4136-9e8d-a4d9a8582351-console-config\") pod \"console-6ffb9bff49-bf2qv\" (UID: \"71fc95a1-d2f5-4136-9e8d-a4d9a8582351\") " pod="openshift-console/console-6ffb9bff49-bf2qv" Feb 18 05:58:31 crc kubenswrapper[4869]: W0218 05:58:31.772453 4869 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02b9ed08_af4d_434a_8042_9b4acedf423c.slice/crio-161c2b1d8330c6ad7ef09987f848a83cbd5d7d41e7cda6cbcfd4ba6d04623f10 WatchSource:0}: Error finding container 161c2b1d8330c6ad7ef09987f848a83cbd5d7d41e7cda6cbcfd4ba6d04623f10: Status 404 returned error can't find the container with id 161c2b1d8330c6ad7ef09987f848a83cbd5d7d41e7cda6cbcfd4ba6d04623f10 Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.852604 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/71fc95a1-d2f5-4136-9e8d-a4d9a8582351-oauth-serving-cert\") pod \"console-6ffb9bff49-bf2qv\" (UID: \"71fc95a1-d2f5-4136-9e8d-a4d9a8582351\") " pod="openshift-console/console-6ffb9bff49-bf2qv" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.852638 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/71fc95a1-d2f5-4136-9e8d-a4d9a8582351-service-ca\") pod \"console-6ffb9bff49-bf2qv\" (UID: \"71fc95a1-d2f5-4136-9e8d-a4d9a8582351\") " pod="openshift-console/console-6ffb9bff49-bf2qv" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.852654 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rvwk\" (UniqueName: \"kubernetes.io/projected/71fc95a1-d2f5-4136-9e8d-a4d9a8582351-kube-api-access-6rvwk\") pod \"console-6ffb9bff49-bf2qv\" (UID: \"71fc95a1-d2f5-4136-9e8d-a4d9a8582351\") " pod="openshift-console/console-6ffb9bff49-bf2qv" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.852670 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/71fc95a1-d2f5-4136-9e8d-a4d9a8582351-console-config\") pod \"console-6ffb9bff49-bf2qv\" (UID: \"71fc95a1-d2f5-4136-9e8d-a4d9a8582351\") " pod="openshift-console/console-6ffb9bff49-bf2qv" Feb 18 05:58:31 
crc kubenswrapper[4869]: I0218 05:58:31.852691 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/71fc95a1-d2f5-4136-9e8d-a4d9a8582351-console-serving-cert\") pod \"console-6ffb9bff49-bf2qv\" (UID: \"71fc95a1-d2f5-4136-9e8d-a4d9a8582351\") " pod="openshift-console/console-6ffb9bff49-bf2qv" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.852714 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71fc95a1-d2f5-4136-9e8d-a4d9a8582351-trusted-ca-bundle\") pod \"console-6ffb9bff49-bf2qv\" (UID: \"71fc95a1-d2f5-4136-9e8d-a4d9a8582351\") " pod="openshift-console/console-6ffb9bff49-bf2qv" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.852776 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/71fc95a1-d2f5-4136-9e8d-a4d9a8582351-console-oauth-config\") pod \"console-6ffb9bff49-bf2qv\" (UID: \"71fc95a1-d2f5-4136-9e8d-a4d9a8582351\") " pod="openshift-console/console-6ffb9bff49-bf2qv" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.854530 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/71fc95a1-d2f5-4136-9e8d-a4d9a8582351-console-config\") pod \"console-6ffb9bff49-bf2qv\" (UID: \"71fc95a1-d2f5-4136-9e8d-a4d9a8582351\") " pod="openshift-console/console-6ffb9bff49-bf2qv" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.854772 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/71fc95a1-d2f5-4136-9e8d-a4d9a8582351-oauth-serving-cert\") pod \"console-6ffb9bff49-bf2qv\" (UID: \"71fc95a1-d2f5-4136-9e8d-a4d9a8582351\") " pod="openshift-console/console-6ffb9bff49-bf2qv" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 
05:58:31.855155 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/71fc95a1-d2f5-4136-9e8d-a4d9a8582351-service-ca\") pod \"console-6ffb9bff49-bf2qv\" (UID: \"71fc95a1-d2f5-4136-9e8d-a4d9a8582351\") " pod="openshift-console/console-6ffb9bff49-bf2qv" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.855855 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71fc95a1-d2f5-4136-9e8d-a4d9a8582351-trusted-ca-bundle\") pod \"console-6ffb9bff49-bf2qv\" (UID: \"71fc95a1-d2f5-4136-9e8d-a4d9a8582351\") " pod="openshift-console/console-6ffb9bff49-bf2qv" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.858406 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/71fc95a1-d2f5-4136-9e8d-a4d9a8582351-console-oauth-config\") pod \"console-6ffb9bff49-bf2qv\" (UID: \"71fc95a1-d2f5-4136-9e8d-a4d9a8582351\") " pod="openshift-console/console-6ffb9bff49-bf2qv" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.858782 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/71fc95a1-d2f5-4136-9e8d-a4d9a8582351-console-serving-cert\") pod \"console-6ffb9bff49-bf2qv\" (UID: \"71fc95a1-d2f5-4136-9e8d-a4d9a8582351\") " pod="openshift-console/console-6ffb9bff49-bf2qv" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.874826 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rvwk\" (UniqueName: \"kubernetes.io/projected/71fc95a1-d2f5-4136-9e8d-a4d9a8582351-kube-api-access-6rvwk\") pod \"console-6ffb9bff49-bf2qv\" (UID: \"71fc95a1-d2f5-4136-9e8d-a4d9a8582351\") " pod="openshift-console/console-6ffb9bff49-bf2qv" Feb 18 05:58:31 crc kubenswrapper[4869]: I0218 05:58:31.937888 4869 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-tkd86"] Feb 18 05:58:31 crc kubenswrapper[4869]: W0218 05:58:31.942783 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c59b69a_72a6_4ce0_9b47_c53016b5ac3a.slice/crio-b7cdecefd380233e0025f68cca12345728692ba30e2b038fcd0bece6a4188881 WatchSource:0}: Error finding container b7cdecefd380233e0025f68cca12345728692ba30e2b038fcd0bece6a4188881: Status 404 returned error can't find the container with id b7cdecefd380233e0025f68cca12345728692ba30e2b038fcd0bece6a4188881 Feb 18 05:58:32 crc kubenswrapper[4869]: I0218 05:58:32.054599 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1582a9d4-d3f8-4ca1-8853-6e6e8cc10d92-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-j5g4n\" (UID: \"1582a9d4-d3f8-4ca1-8853-6e6e8cc10d92\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-j5g4n" Feb 18 05:58:32 crc kubenswrapper[4869]: I0218 05:58:32.058123 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1582a9d4-d3f8-4ca1-8853-6e6e8cc10d92-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-j5g4n\" (UID: \"1582a9d4-d3f8-4ca1-8853-6e6e8cc10d92\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-j5g4n" Feb 18 05:58:32 crc kubenswrapper[4869]: I0218 05:58:32.066010 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6ffb9bff49-bf2qv" Feb 18 05:58:32 crc kubenswrapper[4869]: I0218 05:58:32.155995 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/babd7df7-75bc-470a-a960-c1d0317f2f8e-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-x7pfd\" (UID: \"babd7df7-75bc-470a-a960-c1d0317f2f8e\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x7pfd" Feb 18 05:58:32 crc kubenswrapper[4869]: I0218 05:58:32.159170 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/babd7df7-75bc-470a-a960-c1d0317f2f8e-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-x7pfd\" (UID: \"babd7df7-75bc-470a-a960-c1d0317f2f8e\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x7pfd" Feb 18 05:58:32 crc kubenswrapper[4869]: I0218 05:58:32.159297 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x7pfd" Feb 18 05:58:32 crc kubenswrapper[4869]: I0218 05:58:32.219225 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6ffb9bff49-bf2qv"] Feb 18 05:58:32 crc kubenswrapper[4869]: W0218 05:58:32.229992 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71fc95a1_d2f5_4136_9e8d_a4d9a8582351.slice/crio-ada3bd042c079797b90a24723318dfd0cd2a80b6c0dc9a8ac691837ec0ee5f18 WatchSource:0}: Error finding container ada3bd042c079797b90a24723318dfd0cd2a80b6c0dc9a8ac691837ec0ee5f18: Status 404 returned error can't find the container with id ada3bd042c079797b90a24723318dfd0cd2a80b6c0dc9a8ac691837ec0ee5f18 Feb 18 05:58:32 crc kubenswrapper[4869]: I0218 05:58:32.280573 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-j5g4n" Feb 18 05:58:32 crc kubenswrapper[4869]: I0218 05:58:32.329320 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x7pfd"] Feb 18 05:58:32 crc kubenswrapper[4869]: W0218 05:58:32.332318 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbabd7df7_75bc_470a_a960_c1d0317f2f8e.slice/crio-b75f1cd22bfb9799d508854a478608c218db6a4f89789e50b7864c34878d5dda WatchSource:0}: Error finding container b75f1cd22bfb9799d508854a478608c218db6a4f89789e50b7864c34878d5dda: Status 404 returned error can't find the container with id b75f1cd22bfb9799d508854a478608c218db6a4f89789e50b7864c34878d5dda Feb 18 05:58:32 crc kubenswrapper[4869]: I0218 05:58:32.333592 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ffb9bff49-bf2qv" event={"ID":"71fc95a1-d2f5-4136-9e8d-a4d9a8582351","Type":"ContainerStarted","Data":"ada3bd042c079797b90a24723318dfd0cd2a80b6c0dc9a8ac691837ec0ee5f18"} Feb 18 05:58:32 crc kubenswrapper[4869]: I0218 05:58:32.334797 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-tkd86" event={"ID":"9c59b69a-72a6-4ce0-9b47-c53016b5ac3a","Type":"ContainerStarted","Data":"b7cdecefd380233e0025f68cca12345728692ba30e2b038fcd0bece6a4188881"} Feb 18 05:58:32 crc kubenswrapper[4869]: I0218 05:58:32.335721 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-kljbt" event={"ID":"02b9ed08-af4d-434a-8042-9b4acedf423c","Type":"ContainerStarted","Data":"161c2b1d8330c6ad7ef09987f848a83cbd5d7d41e7cda6cbcfd4ba6d04623f10"} Feb 18 05:58:32 crc kubenswrapper[4869]: I0218 05:58:32.477580 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-j5g4n"] Feb 18 05:58:32 crc kubenswrapper[4869]: W0218 
05:58:32.481070 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1582a9d4_d3f8_4ca1_8853_6e6e8cc10d92.slice/crio-798ece5631ecf4671f344c3a3fec90aa56c52f7a6e3ec0cb4acd27321900e87d WatchSource:0}: Error finding container 798ece5631ecf4671f344c3a3fec90aa56c52f7a6e3ec0cb4acd27321900e87d: Status 404 returned error can't find the container with id 798ece5631ecf4671f344c3a3fec90aa56c52f7a6e3ec0cb4acd27321900e87d Feb 18 05:58:33 crc kubenswrapper[4869]: I0218 05:58:33.343534 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ffb9bff49-bf2qv" event={"ID":"71fc95a1-d2f5-4136-9e8d-a4d9a8582351","Type":"ContainerStarted","Data":"82f20ef361ba32c1e49ac7477500eab2311addcdc273cffbe23e2563a6530a07"} Feb 18 05:58:33 crc kubenswrapper[4869]: I0218 05:58:33.346616 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x7pfd" event={"ID":"babd7df7-75bc-470a-a960-c1d0317f2f8e","Type":"ContainerStarted","Data":"b75f1cd22bfb9799d508854a478608c218db6a4f89789e50b7864c34878d5dda"} Feb 18 05:58:33 crc kubenswrapper[4869]: I0218 05:58:33.348239 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-j5g4n" event={"ID":"1582a9d4-d3f8-4ca1-8853-6e6e8cc10d92","Type":"ContainerStarted","Data":"798ece5631ecf4671f344c3a3fec90aa56c52f7a6e3ec0cb4acd27321900e87d"} Feb 18 05:58:33 crc kubenswrapper[4869]: I0218 05:58:33.361551 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6ffb9bff49-bf2qv" podStartSLOduration=2.3615343429999998 podStartE2EDuration="2.361534343s" podCreationTimestamp="2026-02-18 05:58:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:58:33.361330287 +0000 UTC m=+610.530418519" 
watchObservedRunningTime="2026-02-18 05:58:33.361534343 +0000 UTC m=+610.530622565" Feb 18 05:58:35 crc kubenswrapper[4869]: I0218 05:58:35.363570 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-tkd86" event={"ID":"9c59b69a-72a6-4ce0-9b47-c53016b5ac3a","Type":"ContainerStarted","Data":"aa332b3766f30eaad217b989539cc981c9cff710d197643efa768ecd15a9c9fc"} Feb 18 05:58:35 crc kubenswrapper[4869]: I0218 05:58:35.365372 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-kljbt" event={"ID":"02b9ed08-af4d-434a-8042-9b4acedf423c","Type":"ContainerStarted","Data":"2ac5148ef7fc12040cc19ed223bfd26d9a6e487d78be2e2a19d0561e6e0381fd"} Feb 18 05:58:35 crc kubenswrapper[4869]: I0218 05:58:35.365509 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-kljbt" Feb 18 05:58:35 crc kubenswrapper[4869]: I0218 05:58:35.367434 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x7pfd" event={"ID":"babd7df7-75bc-470a-a960-c1d0317f2f8e","Type":"ContainerStarted","Data":"428a781811d49f85033cae8a322ed00bde6c800f7dee3792b7006bf63b3cf5f1"} Feb 18 05:58:35 crc kubenswrapper[4869]: I0218 05:58:35.369257 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-j5g4n" event={"ID":"1582a9d4-d3f8-4ca1-8853-6e6e8cc10d92","Type":"ContainerStarted","Data":"dc6c143c9cbcbb15b727d0984abb5a67a254f3b49e4b0b173ef1049e079c71da"} Feb 18 05:58:35 crc kubenswrapper[4869]: I0218 05:58:35.369419 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-j5g4n" Feb 18 05:58:35 crc kubenswrapper[4869]: I0218 05:58:35.383703 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-kljbt" podStartSLOduration=1.7275351570000002 
podStartE2EDuration="4.383680728s" podCreationTimestamp="2026-02-18 05:58:31 +0000 UTC" firstStartedPulling="2026-02-18 05:58:31.778352843 +0000 UTC m=+608.947441075" lastFinishedPulling="2026-02-18 05:58:34.434498414 +0000 UTC m=+611.603586646" observedRunningTime="2026-02-18 05:58:35.38207676 +0000 UTC m=+612.551164992" watchObservedRunningTime="2026-02-18 05:58:35.383680728 +0000 UTC m=+612.552768960" Feb 18 05:58:35 crc kubenswrapper[4869]: I0218 05:58:35.399634 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-j5g4n" podStartSLOduration=2.4423668960000002 podStartE2EDuration="4.399617025s" podCreationTimestamp="2026-02-18 05:58:31 +0000 UTC" firstStartedPulling="2026-02-18 05:58:32.48400903 +0000 UTC m=+609.653097262" lastFinishedPulling="2026-02-18 05:58:34.441259149 +0000 UTC m=+611.610347391" observedRunningTime="2026-02-18 05:58:35.395234109 +0000 UTC m=+612.564322341" watchObservedRunningTime="2026-02-18 05:58:35.399617025 +0000 UTC m=+612.568705257" Feb 18 05:58:35 crc kubenswrapper[4869]: I0218 05:58:35.418131 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x7pfd" podStartSLOduration=2.314627833 podStartE2EDuration="4.418106394s" podCreationTimestamp="2026-02-18 05:58:31 +0000 UTC" firstStartedPulling="2026-02-18 05:58:32.334375624 +0000 UTC m=+609.503463856" lastFinishedPulling="2026-02-18 05:58:34.437854185 +0000 UTC m=+611.606942417" observedRunningTime="2026-02-18 05:58:35.411256498 +0000 UTC m=+612.580344740" watchObservedRunningTime="2026-02-18 05:58:35.418106394 +0000 UTC m=+612.587194636" Feb 18 05:58:37 crc kubenswrapper[4869]: I0218 05:58:37.389904 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-tkd86" 
event={"ID":"9c59b69a-72a6-4ce0-9b47-c53016b5ac3a","Type":"ContainerStarted","Data":"fa98d9d144b13a5eca2a7b8320aa94a5811d0f50af0977a4e1aed9f39352165f"} Feb 18 05:58:37 crc kubenswrapper[4869]: I0218 05:58:37.408006 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-tkd86" podStartSLOduration=1.772211472 podStartE2EDuration="6.407986407s" podCreationTimestamp="2026-02-18 05:58:31 +0000 UTC" firstStartedPulling="2026-02-18 05:58:31.945304559 +0000 UTC m=+609.114392791" lastFinishedPulling="2026-02-18 05:58:36.581079494 +0000 UTC m=+613.750167726" observedRunningTime="2026-02-18 05:58:37.405280031 +0000 UTC m=+614.574368273" watchObservedRunningTime="2026-02-18 05:58:37.407986407 +0000 UTC m=+614.577074639" Feb 18 05:58:41 crc kubenswrapper[4869]: I0218 05:58:41.739521 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-kljbt" Feb 18 05:58:42 crc kubenswrapper[4869]: I0218 05:58:42.066850 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6ffb9bff49-bf2qv" Feb 18 05:58:42 crc kubenswrapper[4869]: I0218 05:58:42.066980 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6ffb9bff49-bf2qv" Feb 18 05:58:42 crc kubenswrapper[4869]: I0218 05:58:42.075850 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6ffb9bff49-bf2qv" Feb 18 05:58:42 crc kubenswrapper[4869]: I0218 05:58:42.426717 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6ffb9bff49-bf2qv" Feb 18 05:58:42 crc kubenswrapper[4869]: I0218 05:58:42.494208 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-tc859"] Feb 18 05:58:52 crc kubenswrapper[4869]: I0218 05:58:52.290069 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-j5g4n" Feb 18 05:59:05 crc kubenswrapper[4869]: I0218 05:59:05.307386 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d"] Feb 18 05:59:05 crc kubenswrapper[4869]: I0218 05:59:05.309634 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d" Feb 18 05:59:05 crc kubenswrapper[4869]: I0218 05:59:05.314192 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d"] Feb 18 05:59:05 crc kubenswrapper[4869]: I0218 05:59:05.314934 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 18 05:59:05 crc kubenswrapper[4869]: I0218 05:59:05.414485 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1730c713-0f10-4294-a20d-c7d7a2d4403f-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d\" (UID: \"1730c713-0f10-4294-a20d-c7d7a2d4403f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d" Feb 18 05:59:05 crc kubenswrapper[4869]: I0218 05:59:05.414582 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t96sp\" (UniqueName: \"kubernetes.io/projected/1730c713-0f10-4294-a20d-c7d7a2d4403f-kube-api-access-t96sp\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d\" (UID: \"1730c713-0f10-4294-a20d-c7d7a2d4403f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d" Feb 18 05:59:05 crc kubenswrapper[4869]: I0218 05:59:05.414606 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1730c713-0f10-4294-a20d-c7d7a2d4403f-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d\" (UID: \"1730c713-0f10-4294-a20d-c7d7a2d4403f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d" Feb 18 05:59:05 crc kubenswrapper[4869]: I0218 05:59:05.515882 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1730c713-0f10-4294-a20d-c7d7a2d4403f-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d\" (UID: \"1730c713-0f10-4294-a20d-c7d7a2d4403f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d" Feb 18 05:59:05 crc kubenswrapper[4869]: I0218 05:59:05.515981 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t96sp\" (UniqueName: \"kubernetes.io/projected/1730c713-0f10-4294-a20d-c7d7a2d4403f-kube-api-access-t96sp\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d\" (UID: \"1730c713-0f10-4294-a20d-c7d7a2d4403f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d" Feb 18 05:59:05 crc kubenswrapper[4869]: I0218 05:59:05.516001 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1730c713-0f10-4294-a20d-c7d7a2d4403f-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d\" (UID: \"1730c713-0f10-4294-a20d-c7d7a2d4403f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d" Feb 18 05:59:05 crc kubenswrapper[4869]: I0218 05:59:05.516646 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/1730c713-0f10-4294-a20d-c7d7a2d4403f-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d\" (UID: \"1730c713-0f10-4294-a20d-c7d7a2d4403f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d" Feb 18 05:59:05 crc kubenswrapper[4869]: I0218 05:59:05.516968 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1730c713-0f10-4294-a20d-c7d7a2d4403f-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d\" (UID: \"1730c713-0f10-4294-a20d-c7d7a2d4403f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d" Feb 18 05:59:05 crc kubenswrapper[4869]: I0218 05:59:05.540509 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t96sp\" (UniqueName: \"kubernetes.io/projected/1730c713-0f10-4294-a20d-c7d7a2d4403f-kube-api-access-t96sp\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d\" (UID: \"1730c713-0f10-4294-a20d-c7d7a2d4403f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d" Feb 18 05:59:05 crc kubenswrapper[4869]: I0218 05:59:05.633103 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d" Feb 18 05:59:05 crc kubenswrapper[4869]: I0218 05:59:05.942671 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d"] Feb 18 05:59:05 crc kubenswrapper[4869]: W0218 05:59:05.950361 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1730c713_0f10_4294_a20d_c7d7a2d4403f.slice/crio-0ba92d36b063063ca41210d34f32f23a5376f0c7daca12b0eaf659d0a91572c3 WatchSource:0}: Error finding container 0ba92d36b063063ca41210d34f32f23a5376f0c7daca12b0eaf659d0a91572c3: Status 404 returned error can't find the container with id 0ba92d36b063063ca41210d34f32f23a5376f0c7daca12b0eaf659d0a91572c3 Feb 18 05:59:06 crc kubenswrapper[4869]: I0218 05:59:06.576766 4869 generic.go:334] "Generic (PLEG): container finished" podID="1730c713-0f10-4294-a20d-c7d7a2d4403f" containerID="2157dee02b26112772621e0e9d890486d160c0810af43979593ba31f4a8149b1" exitCode=0 Feb 18 05:59:06 crc kubenswrapper[4869]: I0218 05:59:06.576812 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d" event={"ID":"1730c713-0f10-4294-a20d-c7d7a2d4403f","Type":"ContainerDied","Data":"2157dee02b26112772621e0e9d890486d160c0810af43979593ba31f4a8149b1"} Feb 18 05:59:06 crc kubenswrapper[4869]: I0218 05:59:06.576838 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d" event={"ID":"1730c713-0f10-4294-a20d-c7d7a2d4403f","Type":"ContainerStarted","Data":"0ba92d36b063063ca41210d34f32f23a5376f0c7daca12b0eaf659d0a91572c3"} Feb 18 05:59:07 crc kubenswrapper[4869]: I0218 05:59:07.554782 4869 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-console/console-f9d7485db-tc859" podUID="97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2" containerName="console" containerID="cri-o://1fc2a44e952d5eb7634f862c74fd056922e6d1403d74934b9724ee1cc067e5b7" gracePeriod=15 Feb 18 05:59:07 crc kubenswrapper[4869]: I0218 05:59:07.964702 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-tc859_97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2/console/0.log" Feb 18 05:59:07 crc kubenswrapper[4869]: I0218 05:59:07.965255 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-tc859" Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.158048 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-oauth-serving-cert\") pod \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\" (UID: \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.158113 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-console-serving-cert\") pod \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\" (UID: \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.158138 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-console-config\") pod \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\" (UID: \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.158155 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqdv5\" (UniqueName: \"kubernetes.io/projected/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-kube-api-access-dqdv5\") pod 
\"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\" (UID: \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.158229 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-service-ca\") pod \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\" (UID: \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.158250 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-console-oauth-config\") pod \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\" (UID: \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.158328 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-trusted-ca-bundle\") pod \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\" (UID: \"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2\") " Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.159708 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2" (UID: "97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.160356 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-service-ca" (OuterVolumeSpecName: "service-ca") pod "97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2" (UID: "97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.160518 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-console-config" (OuterVolumeSpecName: "console-config") pod "97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2" (UID: "97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.160672 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2" (UID: "97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.170721 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2" (UID: "97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.171296 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2" (UID: "97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.171319 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-kube-api-access-dqdv5" (OuterVolumeSpecName: "kube-api-access-dqdv5") pod "97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2" (UID: "97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2"). InnerVolumeSpecName "kube-api-access-dqdv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.260889 4869 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.260995 4869 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.261014 4869 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.261029 4869 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.261042 4869 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-console-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.261056 4869 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-dqdv5\" (UniqueName: \"kubernetes.io/projected/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-kube-api-access-dqdv5\") on node \"crc\" DevicePath \"\"" Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.261073 4869 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2-service-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.606870 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-tc859_97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2/console/0.log" Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.607020 4869 generic.go:334] "Generic (PLEG): container finished" podID="97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2" containerID="1fc2a44e952d5eb7634f862c74fd056922e6d1403d74934b9724ee1cc067e5b7" exitCode=2 Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.607087 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tc859" event={"ID":"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2","Type":"ContainerDied","Data":"1fc2a44e952d5eb7634f862c74fd056922e6d1403d74934b9724ee1cc067e5b7"} Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.607119 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-tc859" Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.607170 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tc859" event={"ID":"97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2","Type":"ContainerDied","Data":"2c90b0b6a107b8b2849e81696509f4d01f642eeedd2c6cec1152eabb918f5b40"} Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.607207 4869 scope.go:117] "RemoveContainer" containerID="1fc2a44e952d5eb7634f862c74fd056922e6d1403d74934b9724ee1cc067e5b7" Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.611408 4869 generic.go:334] "Generic (PLEG): container finished" podID="1730c713-0f10-4294-a20d-c7d7a2d4403f" containerID="f8d10c5f622531baf094903332b053f541b7a7dd591720308044d4a9ec7705bb" exitCode=0 Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.611460 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d" event={"ID":"1730c713-0f10-4294-a20d-c7d7a2d4403f","Type":"ContainerDied","Data":"f8d10c5f622531baf094903332b053f541b7a7dd591720308044d4a9ec7705bb"} Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.637213 4869 scope.go:117] "RemoveContainer" containerID="1fc2a44e952d5eb7634f862c74fd056922e6d1403d74934b9724ee1cc067e5b7" Feb 18 05:59:08 crc kubenswrapper[4869]: E0218 05:59:08.638781 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fc2a44e952d5eb7634f862c74fd056922e6d1403d74934b9724ee1cc067e5b7\": container with ID starting with 1fc2a44e952d5eb7634f862c74fd056922e6d1403d74934b9724ee1cc067e5b7 not found: ID does not exist" containerID="1fc2a44e952d5eb7634f862c74fd056922e6d1403d74934b9724ee1cc067e5b7" Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.638963 4869 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1fc2a44e952d5eb7634f862c74fd056922e6d1403d74934b9724ee1cc067e5b7"} err="failed to get container status \"1fc2a44e952d5eb7634f862c74fd056922e6d1403d74934b9724ee1cc067e5b7\": rpc error: code = NotFound desc = could not find container \"1fc2a44e952d5eb7634f862c74fd056922e6d1403d74934b9724ee1cc067e5b7\": container with ID starting with 1fc2a44e952d5eb7634f862c74fd056922e6d1403d74934b9724ee1cc067e5b7 not found: ID does not exist" Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.663854 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-tc859"] Feb 18 05:59:08 crc kubenswrapper[4869]: I0218 05:59:08.669189 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-tc859"] Feb 18 05:59:09 crc kubenswrapper[4869]: I0218 05:59:09.487335 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2" path="/var/lib/kubelet/pods/97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2/volumes" Feb 18 05:59:09 crc kubenswrapper[4869]: I0218 05:59:09.627180 4869 generic.go:334] "Generic (PLEG): container finished" podID="1730c713-0f10-4294-a20d-c7d7a2d4403f" containerID="bb0bea5b6537ac3c57a99683fcd9356ccec05c68bfe97a4dd037dbdd28cdba48" exitCode=0 Feb 18 05:59:09 crc kubenswrapper[4869]: I0218 05:59:09.627260 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d" event={"ID":"1730c713-0f10-4294-a20d-c7d7a2d4403f","Type":"ContainerDied","Data":"bb0bea5b6537ac3c57a99683fcd9356ccec05c68bfe97a4dd037dbdd28cdba48"} Feb 18 05:59:10 crc kubenswrapper[4869]: I0218 05:59:10.907397 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d" Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.057070 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1730c713-0f10-4294-a20d-c7d7a2d4403f-bundle\") pod \"1730c713-0f10-4294-a20d-c7d7a2d4403f\" (UID: \"1730c713-0f10-4294-a20d-c7d7a2d4403f\") " Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.057169 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1730c713-0f10-4294-a20d-c7d7a2d4403f-util\") pod \"1730c713-0f10-4294-a20d-c7d7a2d4403f\" (UID: \"1730c713-0f10-4294-a20d-c7d7a2d4403f\") " Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.057236 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t96sp\" (UniqueName: \"kubernetes.io/projected/1730c713-0f10-4294-a20d-c7d7a2d4403f-kube-api-access-t96sp\") pod \"1730c713-0f10-4294-a20d-c7d7a2d4403f\" (UID: \"1730c713-0f10-4294-a20d-c7d7a2d4403f\") " Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.059105 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1730c713-0f10-4294-a20d-c7d7a2d4403f-bundle" (OuterVolumeSpecName: "bundle") pod "1730c713-0f10-4294-a20d-c7d7a2d4403f" (UID: "1730c713-0f10-4294-a20d-c7d7a2d4403f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.066227 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1730c713-0f10-4294-a20d-c7d7a2d4403f-kube-api-access-t96sp" (OuterVolumeSpecName: "kube-api-access-t96sp") pod "1730c713-0f10-4294-a20d-c7d7a2d4403f" (UID: "1730c713-0f10-4294-a20d-c7d7a2d4403f"). InnerVolumeSpecName "kube-api-access-t96sp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.070766 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1730c713-0f10-4294-a20d-c7d7a2d4403f-util" (OuterVolumeSpecName: "util") pod "1730c713-0f10-4294-a20d-c7d7a2d4403f" (UID: "1730c713-0f10-4294-a20d-c7d7a2d4403f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.158826 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t96sp\" (UniqueName: \"kubernetes.io/projected/1730c713-0f10-4294-a20d-c7d7a2d4403f-kube-api-access-t96sp\") on node \"crc\" DevicePath \"\"" Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.158874 4869 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1730c713-0f10-4294-a20d-c7d7a2d4403f-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.158887 4869 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1730c713-0f10-4294-a20d-c7d7a2d4403f-util\") on node \"crc\" DevicePath \"\"" Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.252691 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pflwb"] Feb 18 05:59:11 crc kubenswrapper[4869]: E0218 05:59:11.253092 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1730c713-0f10-4294-a20d-c7d7a2d4403f" containerName="extract" Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.253115 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="1730c713-0f10-4294-a20d-c7d7a2d4403f" containerName="extract" Feb 18 05:59:11 crc kubenswrapper[4869]: E0218 05:59:11.253196 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2" containerName="console" Feb 
18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.253211 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2" containerName="console" Feb 18 05:59:11 crc kubenswrapper[4869]: E0218 05:59:11.253228 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1730c713-0f10-4294-a20d-c7d7a2d4403f" containerName="pull" Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.253239 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="1730c713-0f10-4294-a20d-c7d7a2d4403f" containerName="pull" Feb 18 05:59:11 crc kubenswrapper[4869]: E0218 05:59:11.253257 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1730c713-0f10-4294-a20d-c7d7a2d4403f" containerName="util" Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.253268 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="1730c713-0f10-4294-a20d-c7d7a2d4403f" containerName="util" Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.253441 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="1730c713-0f10-4294-a20d-c7d7a2d4403f" containerName="extract" Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.253463 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="97572f7e-7ea7-4b5a-b1d4-cd132f0c34a2" containerName="console" Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.255220 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pflwb" Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.262064 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pflwb"] Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.262064 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw48g\" (UniqueName: \"kubernetes.io/projected/4272c230-5646-4d29-ba7d-4522410469e8-kube-api-access-kw48g\") pod \"redhat-operators-pflwb\" (UID: \"4272c230-5646-4d29-ba7d-4522410469e8\") " pod="openshift-marketplace/redhat-operators-pflwb" Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.262282 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4272c230-5646-4d29-ba7d-4522410469e8-catalog-content\") pod \"redhat-operators-pflwb\" (UID: \"4272c230-5646-4d29-ba7d-4522410469e8\") " pod="openshift-marketplace/redhat-operators-pflwb" Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.262322 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4272c230-5646-4d29-ba7d-4522410469e8-utilities\") pod \"redhat-operators-pflwb\" (UID: \"4272c230-5646-4d29-ba7d-4522410469e8\") " pod="openshift-marketplace/redhat-operators-pflwb" Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.364167 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw48g\" (UniqueName: \"kubernetes.io/projected/4272c230-5646-4d29-ba7d-4522410469e8-kube-api-access-kw48g\") pod \"redhat-operators-pflwb\" (UID: \"4272c230-5646-4d29-ba7d-4522410469e8\") " pod="openshift-marketplace/redhat-operators-pflwb" Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.364240 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4272c230-5646-4d29-ba7d-4522410469e8-catalog-content\") pod \"redhat-operators-pflwb\" (UID: \"4272c230-5646-4d29-ba7d-4522410469e8\") " pod="openshift-marketplace/redhat-operators-pflwb" Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.364265 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4272c230-5646-4d29-ba7d-4522410469e8-utilities\") pod \"redhat-operators-pflwb\" (UID: \"4272c230-5646-4d29-ba7d-4522410469e8\") " pod="openshift-marketplace/redhat-operators-pflwb" Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.364872 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4272c230-5646-4d29-ba7d-4522410469e8-utilities\") pod \"redhat-operators-pflwb\" (UID: \"4272c230-5646-4d29-ba7d-4522410469e8\") " pod="openshift-marketplace/redhat-operators-pflwb" Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.364950 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4272c230-5646-4d29-ba7d-4522410469e8-catalog-content\") pod \"redhat-operators-pflwb\" (UID: \"4272c230-5646-4d29-ba7d-4522410469e8\") " pod="openshift-marketplace/redhat-operators-pflwb" Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.386873 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw48g\" (UniqueName: \"kubernetes.io/projected/4272c230-5646-4d29-ba7d-4522410469e8-kube-api-access-kw48g\") pod \"redhat-operators-pflwb\" (UID: \"4272c230-5646-4d29-ba7d-4522410469e8\") " pod="openshift-marketplace/redhat-operators-pflwb" Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.584631 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pflwb" Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.646884 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d" event={"ID":"1730c713-0f10-4294-a20d-c7d7a2d4403f","Type":"ContainerDied","Data":"0ba92d36b063063ca41210d34f32f23a5376f0c7daca12b0eaf659d0a91572c3"} Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.646925 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ba92d36b063063ca41210d34f32f23a5376f0c7daca12b0eaf659d0a91572c3" Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.646995 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d" Feb 18 05:59:11 crc kubenswrapper[4869]: I0218 05:59:11.783507 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pflwb"] Feb 18 05:59:11 crc kubenswrapper[4869]: W0218 05:59:11.798519 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4272c230_5646_4d29_ba7d_4522410469e8.slice/crio-547fe48709587b80e986fc3bcb6fc1802a78dc0075f07ea77819537711058557 WatchSource:0}: Error finding container 547fe48709587b80e986fc3bcb6fc1802a78dc0075f07ea77819537711058557: Status 404 returned error can't find the container with id 547fe48709587b80e986fc3bcb6fc1802a78dc0075f07ea77819537711058557 Feb 18 05:59:12 crc kubenswrapper[4869]: I0218 05:59:12.653533 4869 generic.go:334] "Generic (PLEG): container finished" podID="4272c230-5646-4d29-ba7d-4522410469e8" containerID="716f8ba6056bf551323e8f38420958f089ca9ddc27bcc5922befd4ae40222eff" exitCode=0 Feb 18 05:59:12 crc kubenswrapper[4869]: I0218 05:59:12.653703 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-pflwb" event={"ID":"4272c230-5646-4d29-ba7d-4522410469e8","Type":"ContainerDied","Data":"716f8ba6056bf551323e8f38420958f089ca9ddc27bcc5922befd4ae40222eff"} Feb 18 05:59:12 crc kubenswrapper[4869]: I0218 05:59:12.654022 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pflwb" event={"ID":"4272c230-5646-4d29-ba7d-4522410469e8","Type":"ContainerStarted","Data":"547fe48709587b80e986fc3bcb6fc1802a78dc0075f07ea77819537711058557"} Feb 18 05:59:13 crc kubenswrapper[4869]: I0218 05:59:13.670016 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pflwb" event={"ID":"4272c230-5646-4d29-ba7d-4522410469e8","Type":"ContainerStarted","Data":"cb5492f62cf2e820a8143790dc23a9732127060e8083c9bd1b0e07ed05adaa2c"} Feb 18 05:59:14 crc kubenswrapper[4869]: I0218 05:59:14.681482 4869 generic.go:334] "Generic (PLEG): container finished" podID="4272c230-5646-4d29-ba7d-4522410469e8" containerID="cb5492f62cf2e820a8143790dc23a9732127060e8083c9bd1b0e07ed05adaa2c" exitCode=0 Feb 18 05:59:14 crc kubenswrapper[4869]: I0218 05:59:14.681586 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pflwb" event={"ID":"4272c230-5646-4d29-ba7d-4522410469e8","Type":"ContainerDied","Data":"cb5492f62cf2e820a8143790dc23a9732127060e8083c9bd1b0e07ed05adaa2c"} Feb 18 05:59:15 crc kubenswrapper[4869]: I0218 05:59:15.341982 4869 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 18 05:59:15 crc kubenswrapper[4869]: I0218 05:59:15.690241 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pflwb" event={"ID":"4272c230-5646-4d29-ba7d-4522410469e8","Type":"ContainerStarted","Data":"f1a01a8fd80ddae4310602bc04a9a3914ccdeba071642146f6f3617ad916e4d0"} Feb 18 05:59:15 crc kubenswrapper[4869]: I0218 05:59:15.711244 4869 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pflwb" podStartSLOduration=2.271458249 podStartE2EDuration="4.711226191s" podCreationTimestamp="2026-02-18 05:59:11 +0000 UTC" firstStartedPulling="2026-02-18 05:59:12.655590906 +0000 UTC m=+649.824679138" lastFinishedPulling="2026-02-18 05:59:15.095358838 +0000 UTC m=+652.264447080" observedRunningTime="2026-02-18 05:59:15.708588347 +0000 UTC m=+652.877676579" watchObservedRunningTime="2026-02-18 05:59:15.711226191 +0000 UTC m=+652.880314423" Feb 18 05:59:21 crc kubenswrapper[4869]: I0218 05:59:21.585413 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pflwb" Feb 18 05:59:21 crc kubenswrapper[4869]: I0218 05:59:21.586857 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pflwb" Feb 18 05:59:21 crc kubenswrapper[4869]: I0218 05:59:21.674863 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pflwb" Feb 18 05:59:21 crc kubenswrapper[4869]: I0218 05:59:21.764474 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pflwb" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.186017 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-55df77c686-fqtt5"] Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.187691 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-55df77c686-fqtt5" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.190173 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.190455 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.190596 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.191093 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.191330 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-s4d9c" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.201142 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-55df77c686-fqtt5"] Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.338809 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/74eef01f-c0d7-449c-bca9-7eb78f808110-webhook-cert\") pod \"metallb-operator-controller-manager-55df77c686-fqtt5\" (UID: \"74eef01f-c0d7-449c-bca9-7eb78f808110\") " pod="metallb-system/metallb-operator-controller-manager-55df77c686-fqtt5" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.338877 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/74eef01f-c0d7-449c-bca9-7eb78f808110-apiservice-cert\") pod \"metallb-operator-controller-manager-55df77c686-fqtt5\" (UID: 
\"74eef01f-c0d7-449c-bca9-7eb78f808110\") " pod="metallb-system/metallb-operator-controller-manager-55df77c686-fqtt5" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.338907 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9wkq\" (UniqueName: \"kubernetes.io/projected/74eef01f-c0d7-449c-bca9-7eb78f808110-kube-api-access-w9wkq\") pod \"metallb-operator-controller-manager-55df77c686-fqtt5\" (UID: \"74eef01f-c0d7-449c-bca9-7eb78f808110\") " pod="metallb-system/metallb-operator-controller-manager-55df77c686-fqtt5" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.410989 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-69666c74dd-pv6sz"] Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.412224 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-69666c74dd-pv6sz" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.414126 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-7jflw" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.414568 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.414600 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.424559 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-69666c74dd-pv6sz"] Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.440218 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/74eef01f-c0d7-449c-bca9-7eb78f808110-webhook-cert\") pod 
\"metallb-operator-controller-manager-55df77c686-fqtt5\" (UID: \"74eef01f-c0d7-449c-bca9-7eb78f808110\") " pod="metallb-system/metallb-operator-controller-manager-55df77c686-fqtt5" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.440280 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/74eef01f-c0d7-449c-bca9-7eb78f808110-apiservice-cert\") pod \"metallb-operator-controller-manager-55df77c686-fqtt5\" (UID: \"74eef01f-c0d7-449c-bca9-7eb78f808110\") " pod="metallb-system/metallb-operator-controller-manager-55df77c686-fqtt5" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.440567 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9wkq\" (UniqueName: \"kubernetes.io/projected/74eef01f-c0d7-449c-bca9-7eb78f808110-kube-api-access-w9wkq\") pod \"metallb-operator-controller-manager-55df77c686-fqtt5\" (UID: \"74eef01f-c0d7-449c-bca9-7eb78f808110\") " pod="metallb-system/metallb-operator-controller-manager-55df77c686-fqtt5" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.442141 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.461962 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.468025 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/74eef01f-c0d7-449c-bca9-7eb78f808110-webhook-cert\") pod \"metallb-operator-controller-manager-55df77c686-fqtt5\" (UID: \"74eef01f-c0d7-449c-bca9-7eb78f808110\") " pod="metallb-system/metallb-operator-controller-manager-55df77c686-fqtt5" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.469186 4869 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/74eef01f-c0d7-449c-bca9-7eb78f808110-apiservice-cert\") pod \"metallb-operator-controller-manager-55df77c686-fqtt5\" (UID: \"74eef01f-c0d7-449c-bca9-7eb78f808110\") " pod="metallb-system/metallb-operator-controller-manager-55df77c686-fqtt5" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.472058 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.482554 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9wkq\" (UniqueName: \"kubernetes.io/projected/74eef01f-c0d7-449c-bca9-7eb78f808110-kube-api-access-w9wkq\") pod \"metallb-operator-controller-manager-55df77c686-fqtt5\" (UID: \"74eef01f-c0d7-449c-bca9-7eb78f808110\") " pod="metallb-system/metallb-operator-controller-manager-55df77c686-fqtt5" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.535451 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-s4d9c" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.541870 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-55df77c686-fqtt5" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.542225 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d597c072-fd48-4245-8a9a-5a80aaa78993-webhook-cert\") pod \"metallb-operator-webhook-server-69666c74dd-pv6sz\" (UID: \"d597c072-fd48-4245-8a9a-5a80aaa78993\") " pod="metallb-system/metallb-operator-webhook-server-69666c74dd-pv6sz" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.542602 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d597c072-fd48-4245-8a9a-5a80aaa78993-apiservice-cert\") pod \"metallb-operator-webhook-server-69666c74dd-pv6sz\" (UID: \"d597c072-fd48-4245-8a9a-5a80aaa78993\") " pod="metallb-system/metallb-operator-webhook-server-69666c74dd-pv6sz" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.544504 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5shb\" (UniqueName: \"kubernetes.io/projected/d597c072-fd48-4245-8a9a-5a80aaa78993-kube-api-access-r5shb\") pod \"metallb-operator-webhook-server-69666c74dd-pv6sz\" (UID: \"d597c072-fd48-4245-8a9a-5a80aaa78993\") " pod="metallb-system/metallb-operator-webhook-server-69666c74dd-pv6sz" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.645613 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d597c072-fd48-4245-8a9a-5a80aaa78993-webhook-cert\") pod \"metallb-operator-webhook-server-69666c74dd-pv6sz\" (UID: \"d597c072-fd48-4245-8a9a-5a80aaa78993\") " pod="metallb-system/metallb-operator-webhook-server-69666c74dd-pv6sz" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.645657 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d597c072-fd48-4245-8a9a-5a80aaa78993-apiservice-cert\") pod \"metallb-operator-webhook-server-69666c74dd-pv6sz\" (UID: \"d597c072-fd48-4245-8a9a-5a80aaa78993\") " pod="metallb-system/metallb-operator-webhook-server-69666c74dd-pv6sz" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.645684 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5shb\" (UniqueName: \"kubernetes.io/projected/d597c072-fd48-4245-8a9a-5a80aaa78993-kube-api-access-r5shb\") pod \"metallb-operator-webhook-server-69666c74dd-pv6sz\" (UID: \"d597c072-fd48-4245-8a9a-5a80aaa78993\") " pod="metallb-system/metallb-operator-webhook-server-69666c74dd-pv6sz" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.651093 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d597c072-fd48-4245-8a9a-5a80aaa78993-apiservice-cert\") pod \"metallb-operator-webhook-server-69666c74dd-pv6sz\" (UID: \"d597c072-fd48-4245-8a9a-5a80aaa78993\") " pod="metallb-system/metallb-operator-webhook-server-69666c74dd-pv6sz" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.654338 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d597c072-fd48-4245-8a9a-5a80aaa78993-webhook-cert\") pod \"metallb-operator-webhook-server-69666c74dd-pv6sz\" (UID: \"d597c072-fd48-4245-8a9a-5a80aaa78993\") " pod="metallb-system/metallb-operator-webhook-server-69666c74dd-pv6sz" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.671147 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5shb\" (UniqueName: \"kubernetes.io/projected/d597c072-fd48-4245-8a9a-5a80aaa78993-kube-api-access-r5shb\") pod \"metallb-operator-webhook-server-69666c74dd-pv6sz\" (UID: \"d597c072-fd48-4245-8a9a-5a80aaa78993\") 
" pod="metallb-system/metallb-operator-webhook-server-69666c74dd-pv6sz" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.725629 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-69666c74dd-pv6sz" Feb 18 05:59:23 crc kubenswrapper[4869]: I0218 05:59:23.937349 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-69666c74dd-pv6sz"] Feb 18 05:59:24 crc kubenswrapper[4869]: I0218 05:59:24.007715 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-55df77c686-fqtt5"] Feb 18 05:59:24 crc kubenswrapper[4869]: W0218 05:59:24.010623 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74eef01f_c0d7_449c_bca9_7eb78f808110.slice/crio-2278387c74969f5d5656844f24aaaaeadc9b05f1421410e8fbe4b7f79ec50e69 WatchSource:0}: Error finding container 2278387c74969f5d5656844f24aaaaeadc9b05f1421410e8fbe4b7f79ec50e69: Status 404 returned error can't find the container with id 2278387c74969f5d5656844f24aaaaeadc9b05f1421410e8fbe4b7f79ec50e69 Feb 18 05:59:24 crc kubenswrapper[4869]: I0218 05:59:24.237111 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pflwb"] Feb 18 05:59:24 crc kubenswrapper[4869]: I0218 05:59:24.738176 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-69666c74dd-pv6sz" event={"ID":"d597c072-fd48-4245-8a9a-5a80aaa78993","Type":"ContainerStarted","Data":"7d64d41943a63a4de5e2f281d8beb43ff9c56e87682f652415d26af8ac5ea240"} Feb 18 05:59:24 crc kubenswrapper[4869]: I0218 05:59:24.739191 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-55df77c686-fqtt5" 
event={"ID":"74eef01f-c0d7-449c-bca9-7eb78f808110","Type":"ContainerStarted","Data":"2278387c74969f5d5656844f24aaaaeadc9b05f1421410e8fbe4b7f79ec50e69"} Feb 18 05:59:24 crc kubenswrapper[4869]: I0218 05:59:24.739352 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pflwb" podUID="4272c230-5646-4d29-ba7d-4522410469e8" containerName="registry-server" containerID="cri-o://f1a01a8fd80ddae4310602bc04a9a3914ccdeba071642146f6f3617ad916e4d0" gracePeriod=2 Feb 18 05:59:25 crc kubenswrapper[4869]: I0218 05:59:25.762868 4869 generic.go:334] "Generic (PLEG): container finished" podID="4272c230-5646-4d29-ba7d-4522410469e8" containerID="f1a01a8fd80ddae4310602bc04a9a3914ccdeba071642146f6f3617ad916e4d0" exitCode=0 Feb 18 05:59:25 crc kubenswrapper[4869]: I0218 05:59:25.762954 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pflwb" event={"ID":"4272c230-5646-4d29-ba7d-4522410469e8","Type":"ContainerDied","Data":"f1a01a8fd80ddae4310602bc04a9a3914ccdeba071642146f6f3617ad916e4d0"} Feb 18 05:59:26 crc kubenswrapper[4869]: I0218 05:59:26.230590 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pflwb" Feb 18 05:59:26 crc kubenswrapper[4869]: I0218 05:59:26.281964 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4272c230-5646-4d29-ba7d-4522410469e8-catalog-content\") pod \"4272c230-5646-4d29-ba7d-4522410469e8\" (UID: \"4272c230-5646-4d29-ba7d-4522410469e8\") " Feb 18 05:59:26 crc kubenswrapper[4869]: I0218 05:59:26.282043 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kw48g\" (UniqueName: \"kubernetes.io/projected/4272c230-5646-4d29-ba7d-4522410469e8-kube-api-access-kw48g\") pod \"4272c230-5646-4d29-ba7d-4522410469e8\" (UID: \"4272c230-5646-4d29-ba7d-4522410469e8\") " Feb 18 05:59:26 crc kubenswrapper[4869]: I0218 05:59:26.282122 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4272c230-5646-4d29-ba7d-4522410469e8-utilities\") pod \"4272c230-5646-4d29-ba7d-4522410469e8\" (UID: \"4272c230-5646-4d29-ba7d-4522410469e8\") " Feb 18 05:59:26 crc kubenswrapper[4869]: I0218 05:59:26.283571 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4272c230-5646-4d29-ba7d-4522410469e8-utilities" (OuterVolumeSpecName: "utilities") pod "4272c230-5646-4d29-ba7d-4522410469e8" (UID: "4272c230-5646-4d29-ba7d-4522410469e8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:59:26 crc kubenswrapper[4869]: I0218 05:59:26.289103 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4272c230-5646-4d29-ba7d-4522410469e8-kube-api-access-kw48g" (OuterVolumeSpecName: "kube-api-access-kw48g") pod "4272c230-5646-4d29-ba7d-4522410469e8" (UID: "4272c230-5646-4d29-ba7d-4522410469e8"). InnerVolumeSpecName "kube-api-access-kw48g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:59:26 crc kubenswrapper[4869]: I0218 05:59:26.390409 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4272c230-5646-4d29-ba7d-4522410469e8-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 05:59:26 crc kubenswrapper[4869]: I0218 05:59:26.390438 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kw48g\" (UniqueName: \"kubernetes.io/projected/4272c230-5646-4d29-ba7d-4522410469e8-kube-api-access-kw48g\") on node \"crc\" DevicePath \"\"" Feb 18 05:59:26 crc kubenswrapper[4869]: I0218 05:59:26.437909 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4272c230-5646-4d29-ba7d-4522410469e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4272c230-5646-4d29-ba7d-4522410469e8" (UID: "4272c230-5646-4d29-ba7d-4522410469e8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:59:26 crc kubenswrapper[4869]: I0218 05:59:26.492815 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4272c230-5646-4d29-ba7d-4522410469e8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 05:59:26 crc kubenswrapper[4869]: I0218 05:59:26.772075 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pflwb" event={"ID":"4272c230-5646-4d29-ba7d-4522410469e8","Type":"ContainerDied","Data":"547fe48709587b80e986fc3bcb6fc1802a78dc0075f07ea77819537711058557"} Feb 18 05:59:26 crc kubenswrapper[4869]: I0218 05:59:26.772136 4869 scope.go:117] "RemoveContainer" containerID="f1a01a8fd80ddae4310602bc04a9a3914ccdeba071642146f6f3617ad916e4d0" Feb 18 05:59:26 crc kubenswrapper[4869]: I0218 05:59:26.772143 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pflwb" Feb 18 05:59:26 crc kubenswrapper[4869]: I0218 05:59:26.811184 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pflwb"] Feb 18 05:59:26 crc kubenswrapper[4869]: I0218 05:59:26.834222 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pflwb"] Feb 18 05:59:27 crc kubenswrapper[4869]: I0218 05:59:27.477165 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4272c230-5646-4d29-ba7d-4522410469e8" path="/var/lib/kubelet/pods/4272c230-5646-4d29-ba7d-4522410469e8/volumes" Feb 18 05:59:28 crc kubenswrapper[4869]: I0218 05:59:28.612973 4869 scope.go:117] "RemoveContainer" containerID="cb5492f62cf2e820a8143790dc23a9732127060e8083c9bd1b0e07ed05adaa2c" Feb 18 05:59:28 crc kubenswrapper[4869]: I0218 05:59:28.669887 4869 scope.go:117] "RemoveContainer" containerID="716f8ba6056bf551323e8f38420958f089ca9ddc27bcc5922befd4ae40222eff" Feb 18 05:59:29 crc kubenswrapper[4869]: I0218 05:59:29.796078 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-55df77c686-fqtt5" event={"ID":"74eef01f-c0d7-449c-bca9-7eb78f808110","Type":"ContainerStarted","Data":"9587652e0cf63b36f4566551cbb29ab4e7bfdaeb5dafb86ff17df7e9991138ab"} Feb 18 05:59:29 crc kubenswrapper[4869]: I0218 05:59:29.796154 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-55df77c686-fqtt5" Feb 18 05:59:29 crc kubenswrapper[4869]: I0218 05:59:29.797379 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-69666c74dd-pv6sz" event={"ID":"d597c072-fd48-4245-8a9a-5a80aaa78993","Type":"ContainerStarted","Data":"1053528d0e7d4357e921170ea9e3f42bd078db67c79e958bffd78cc776f6da8f"} Feb 18 05:59:29 crc kubenswrapper[4869]: I0218 05:59:29.797495 4869 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-69666c74dd-pv6sz" Feb 18 05:59:29 crc kubenswrapper[4869]: I0218 05:59:29.840594 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-69666c74dd-pv6sz" podStartSLOduration=2.13567016 podStartE2EDuration="6.840573231s" podCreationTimestamp="2026-02-18 05:59:23 +0000 UTC" firstStartedPulling="2026-02-18 05:59:23.966796276 +0000 UTC m=+661.135884508" lastFinishedPulling="2026-02-18 05:59:28.671699347 +0000 UTC m=+665.840787579" observedRunningTime="2026-02-18 05:59:29.840051847 +0000 UTC m=+667.009140089" watchObservedRunningTime="2026-02-18 05:59:29.840573231 +0000 UTC m=+667.009661463" Feb 18 05:59:29 crc kubenswrapper[4869]: I0218 05:59:29.842895 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-55df77c686-fqtt5" podStartSLOduration=2.181901615 podStartE2EDuration="6.842878646s" podCreationTimestamp="2026-02-18 05:59:23 +0000 UTC" firstStartedPulling="2026-02-18 05:59:24.014786185 +0000 UTC m=+661.183874417" lastFinishedPulling="2026-02-18 05:59:28.675763216 +0000 UTC m=+665.844851448" observedRunningTime="2026-02-18 05:59:29.823002392 +0000 UTC m=+666.992090644" watchObservedRunningTime="2026-02-18 05:59:29.842878646 +0000 UTC m=+667.011966878" Feb 18 05:59:43 crc kubenswrapper[4869]: I0218 05:59:43.731903 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-69666c74dd-pv6sz" Feb 18 05:59:51 crc kubenswrapper[4869]: I0218 05:59:51.061153 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tfzgn"] Feb 18 05:59:51 crc kubenswrapper[4869]: E0218 05:59:51.061651 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4272c230-5646-4d29-ba7d-4522410469e8" containerName="extract-utilities" 
Feb 18 05:59:51 crc kubenswrapper[4869]: I0218 05:59:51.061664 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="4272c230-5646-4d29-ba7d-4522410469e8" containerName="extract-utilities"
Feb 18 05:59:51 crc kubenswrapper[4869]: E0218 05:59:51.061675 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4272c230-5646-4d29-ba7d-4522410469e8" containerName="extract-content"
Feb 18 05:59:51 crc kubenswrapper[4869]: I0218 05:59:51.061681 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="4272c230-5646-4d29-ba7d-4522410469e8" containerName="extract-content"
Feb 18 05:59:51 crc kubenswrapper[4869]: E0218 05:59:51.061693 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4272c230-5646-4d29-ba7d-4522410469e8" containerName="registry-server"
Feb 18 05:59:51 crc kubenswrapper[4869]: I0218 05:59:51.061699 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="4272c230-5646-4d29-ba7d-4522410469e8" containerName="registry-server"
Feb 18 05:59:51 crc kubenswrapper[4869]: I0218 05:59:51.061819 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="4272c230-5646-4d29-ba7d-4522410469e8" containerName="registry-server"
Feb 18 05:59:51 crc kubenswrapper[4869]: I0218 05:59:51.062603 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tfzgn"
Feb 18 05:59:51 crc kubenswrapper[4869]: I0218 05:59:51.083105 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tfzgn"]
Feb 18 05:59:51 crc kubenswrapper[4869]: I0218 05:59:51.122132 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hltlf\" (UniqueName: \"kubernetes.io/projected/d0c848a4-7ab3-4e02-8508-12e5c83e09de-kube-api-access-hltlf\") pod \"community-operators-tfzgn\" (UID: \"d0c848a4-7ab3-4e02-8508-12e5c83e09de\") " pod="openshift-marketplace/community-operators-tfzgn"
Feb 18 05:59:51 crc kubenswrapper[4869]: I0218 05:59:51.122223 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0c848a4-7ab3-4e02-8508-12e5c83e09de-utilities\") pod \"community-operators-tfzgn\" (UID: \"d0c848a4-7ab3-4e02-8508-12e5c83e09de\") " pod="openshift-marketplace/community-operators-tfzgn"
Feb 18 05:59:51 crc kubenswrapper[4869]: I0218 05:59:51.122248 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0c848a4-7ab3-4e02-8508-12e5c83e09de-catalog-content\") pod \"community-operators-tfzgn\" (UID: \"d0c848a4-7ab3-4e02-8508-12e5c83e09de\") " pod="openshift-marketplace/community-operators-tfzgn"
Feb 18 05:59:51 crc kubenswrapper[4869]: I0218 05:59:51.223900 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0c848a4-7ab3-4e02-8508-12e5c83e09de-utilities\") pod \"community-operators-tfzgn\" (UID: \"d0c848a4-7ab3-4e02-8508-12e5c83e09de\") " pod="openshift-marketplace/community-operators-tfzgn"
Feb 18 05:59:51 crc kubenswrapper[4869]: I0218 05:59:51.223964 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0c848a4-7ab3-4e02-8508-12e5c83e09de-catalog-content\") pod \"community-operators-tfzgn\" (UID: \"d0c848a4-7ab3-4e02-8508-12e5c83e09de\") " pod="openshift-marketplace/community-operators-tfzgn"
Feb 18 05:59:51 crc kubenswrapper[4869]: I0218 05:59:51.224028 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hltlf\" (UniqueName: \"kubernetes.io/projected/d0c848a4-7ab3-4e02-8508-12e5c83e09de-kube-api-access-hltlf\") pod \"community-operators-tfzgn\" (UID: \"d0c848a4-7ab3-4e02-8508-12e5c83e09de\") " pod="openshift-marketplace/community-operators-tfzgn"
Feb 18 05:59:51 crc kubenswrapper[4869]: I0218 05:59:51.224443 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0c848a4-7ab3-4e02-8508-12e5c83e09de-utilities\") pod \"community-operators-tfzgn\" (UID: \"d0c848a4-7ab3-4e02-8508-12e5c83e09de\") " pod="openshift-marketplace/community-operators-tfzgn"
Feb 18 05:59:51 crc kubenswrapper[4869]: I0218 05:59:51.224593 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0c848a4-7ab3-4e02-8508-12e5c83e09de-catalog-content\") pod \"community-operators-tfzgn\" (UID: \"d0c848a4-7ab3-4e02-8508-12e5c83e09de\") " pod="openshift-marketplace/community-operators-tfzgn"
Feb 18 05:59:51 crc kubenswrapper[4869]: I0218 05:59:51.253131 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hltlf\" (UniqueName: \"kubernetes.io/projected/d0c848a4-7ab3-4e02-8508-12e5c83e09de-kube-api-access-hltlf\") pod \"community-operators-tfzgn\" (UID: \"d0c848a4-7ab3-4e02-8508-12e5c83e09de\") " pod="openshift-marketplace/community-operators-tfzgn"
Feb 18 05:59:51 crc kubenswrapper[4869]: I0218 05:59:51.377907 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tfzgn"
Feb 18 05:59:51 crc kubenswrapper[4869]: I0218 05:59:51.678475 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tfzgn"]
Feb 18 05:59:51 crc kubenswrapper[4869]: I0218 05:59:51.911547 4869 generic.go:334] "Generic (PLEG): container finished" podID="d0c848a4-7ab3-4e02-8508-12e5c83e09de" containerID="3d28a417cadf7093721aa0efaea4d6b3df48569b13618d8fb8f53a51a0f715ae" exitCode=0
Feb 18 05:59:51 crc kubenswrapper[4869]: I0218 05:59:51.911582 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tfzgn" event={"ID":"d0c848a4-7ab3-4e02-8508-12e5c83e09de","Type":"ContainerDied","Data":"3d28a417cadf7093721aa0efaea4d6b3df48569b13618d8fb8f53a51a0f715ae"}
Feb 18 05:59:51 crc kubenswrapper[4869]: I0218 05:59:51.911606 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tfzgn" event={"ID":"d0c848a4-7ab3-4e02-8508-12e5c83e09de","Type":"ContainerStarted","Data":"ace7704ee5812ea716580f3cdea520ec068111130caba36d70a30a8e6911aa5d"}
Feb 18 05:59:52 crc kubenswrapper[4869]: I0218 05:59:52.916993 4869 generic.go:334] "Generic (PLEG): container finished" podID="d0c848a4-7ab3-4e02-8508-12e5c83e09de" containerID="806b941898f16aa18d5fb06edb38650dccaaef3565b0422c352e380b4caa843e" exitCode=0
Feb 18 05:59:52 crc kubenswrapper[4869]: I0218 05:59:52.917059 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tfzgn" event={"ID":"d0c848a4-7ab3-4e02-8508-12e5c83e09de","Type":"ContainerDied","Data":"806b941898f16aa18d5fb06edb38650dccaaef3565b0422c352e380b4caa843e"}
Feb 18 05:59:53 crc kubenswrapper[4869]: I0218 05:59:53.925977 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tfzgn" event={"ID":"d0c848a4-7ab3-4e02-8508-12e5c83e09de","Type":"ContainerStarted","Data":"32844ab1e06b97646d2efbe4f34f9c7aa3276db96d3a18b523a45a3549e46c66"}
Feb 18 05:59:53 crc kubenswrapper[4869]: I0218 05:59:53.943533 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tfzgn" podStartSLOduration=1.567068879 podStartE2EDuration="2.943519509s" podCreationTimestamp="2026-02-18 05:59:51 +0000 UTC" firstStartedPulling="2026-02-18 05:59:51.91277343 +0000 UTC m=+689.081861652" lastFinishedPulling="2026-02-18 05:59:53.28922406 +0000 UTC m=+690.458312282" observedRunningTime="2026-02-18 05:59:53.942369551 +0000 UTC m=+691.111457783" watchObservedRunningTime="2026-02-18 05:59:53.943519509 +0000 UTC m=+691.112607741"
Feb 18 05:59:54 crc kubenswrapper[4869]: I0218 05:59:54.840031 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hnlhm"]
Feb 18 05:59:54 crc kubenswrapper[4869]: I0218 05:59:54.841495 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnlhm"
Feb 18 05:59:54 crc kubenswrapper[4869]: I0218 05:59:54.850143 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnlhm"]
Feb 18 05:59:54 crc kubenswrapper[4869]: I0218 05:59:54.971532 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2lpn\" (UniqueName: \"kubernetes.io/projected/4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c-kube-api-access-s2lpn\") pod \"redhat-marketplace-hnlhm\" (UID: \"4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c\") " pod="openshift-marketplace/redhat-marketplace-hnlhm"
Feb 18 05:59:54 crc kubenswrapper[4869]: I0218 05:59:54.971579 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c-utilities\") pod \"redhat-marketplace-hnlhm\" (UID: \"4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c\") " pod="openshift-marketplace/redhat-marketplace-hnlhm"
Feb 18 05:59:54 crc kubenswrapper[4869]: I0218 05:59:54.971708 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c-catalog-content\") pod \"redhat-marketplace-hnlhm\" (UID: \"4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c\") " pod="openshift-marketplace/redhat-marketplace-hnlhm"
Feb 18 05:59:55 crc kubenswrapper[4869]: I0218 05:59:55.072512 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2lpn\" (UniqueName: \"kubernetes.io/projected/4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c-kube-api-access-s2lpn\") pod \"redhat-marketplace-hnlhm\" (UID: \"4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c\") " pod="openshift-marketplace/redhat-marketplace-hnlhm"
Feb 18 05:59:55 crc kubenswrapper[4869]: I0218 05:59:55.072563 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c-utilities\") pod \"redhat-marketplace-hnlhm\" (UID: \"4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c\") " pod="openshift-marketplace/redhat-marketplace-hnlhm"
Feb 18 05:59:55 crc kubenswrapper[4869]: I0218 05:59:55.072607 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c-catalog-content\") pod \"redhat-marketplace-hnlhm\" (UID: \"4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c\") " pod="openshift-marketplace/redhat-marketplace-hnlhm"
Feb 18 05:59:55 crc kubenswrapper[4869]: I0218 05:59:55.073051 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c-catalog-content\") pod \"redhat-marketplace-hnlhm\" (UID: \"4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c\") " pod="openshift-marketplace/redhat-marketplace-hnlhm"
Feb 18 05:59:55 crc kubenswrapper[4869]: I0218 05:59:55.073168 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c-utilities\") pod \"redhat-marketplace-hnlhm\" (UID: \"4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c\") " pod="openshift-marketplace/redhat-marketplace-hnlhm"
Feb 18 05:59:55 crc kubenswrapper[4869]: I0218 05:59:55.095606 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2lpn\" (UniqueName: \"kubernetes.io/projected/4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c-kube-api-access-s2lpn\") pod \"redhat-marketplace-hnlhm\" (UID: \"4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c\") " pod="openshift-marketplace/redhat-marketplace-hnlhm"
Feb 18 05:59:55 crc kubenswrapper[4869]: I0218 05:59:55.157942 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnlhm"
Feb 18 05:59:55 crc kubenswrapper[4869]: I0218 05:59:55.561786 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnlhm"]
Feb 18 05:59:55 crc kubenswrapper[4869]: W0218 05:59:55.567994 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f1cf7c5_3ebe_4a49_bb97_61dfd265ef0c.slice/crio-2d05277816964333b37d9848f7cd73a96ff79be20544fcaaa028eb0df81c5cc4 WatchSource:0}: Error finding container 2d05277816964333b37d9848f7cd73a96ff79be20544fcaaa028eb0df81c5cc4: Status 404 returned error can't find the container with id 2d05277816964333b37d9848f7cd73a96ff79be20544fcaaa028eb0df81c5cc4
Feb 18 05:59:55 crc kubenswrapper[4869]: I0218 05:59:55.938524 4869 generic.go:334] "Generic (PLEG): container finished" podID="4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c" containerID="2687aabaa10fbe9916647c606b8f30ed8fa597ecc813f0d84af94442dee35e47" exitCode=0
Feb 18 05:59:55 crc kubenswrapper[4869]: I0218 05:59:55.938613 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnlhm" event={"ID":"4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c","Type":"ContainerDied","Data":"2687aabaa10fbe9916647c606b8f30ed8fa597ecc813f0d84af94442dee35e47"}
Feb 18 05:59:55 crc kubenswrapper[4869]: I0218 05:59:55.938848 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnlhm" event={"ID":"4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c","Type":"ContainerStarted","Data":"2d05277816964333b37d9848f7cd73a96ff79be20544fcaaa028eb0df81c5cc4"}
Feb 18 05:59:56 crc kubenswrapper[4869]: I0218 05:59:56.945154 4869 generic.go:334] "Generic (PLEG): container finished" podID="4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c" containerID="e691b898c7cda94bba11b6eaff8f97af512a0557925187459f835282331120a4" exitCode=0
Feb 18 05:59:56 crc kubenswrapper[4869]: I0218 05:59:56.945198 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnlhm" event={"ID":"4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c","Type":"ContainerDied","Data":"e691b898c7cda94bba11b6eaff8f97af512a0557925187459f835282331120a4"}
Feb 18 05:59:58 crc kubenswrapper[4869]: I0218 05:59:58.955930 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnlhm" event={"ID":"4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c","Type":"ContainerStarted","Data":"24f65ab07da090311268ccb59346a764e5744b15d1f99e504d493e1bcc4c8db2"}
Feb 18 06:00:00 crc kubenswrapper[4869]: I0218 06:00:00.147549 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hnlhm" podStartSLOduration=4.075041212 podStartE2EDuration="6.147532097s" podCreationTimestamp="2026-02-18 05:59:54 +0000 UTC" firstStartedPulling="2026-02-18 05:59:55.940490965 +0000 UTC m=+693.109579197" lastFinishedPulling="2026-02-18 05:59:58.01298186 +0000 UTC m=+695.182070082" observedRunningTime="2026-02-18 05:59:58.975195289 +0000 UTC m=+696.144283521" watchObservedRunningTime="2026-02-18 06:00:00.147532097 +0000 UTC m=+697.316620329"
Feb 18 06:00:00 crc kubenswrapper[4869]: I0218 06:00:00.148679 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523240-4nxtb"]
Feb 18 06:00:00 crc kubenswrapper[4869]: I0218 06:00:00.149495 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-4nxtb"
Feb 18 06:00:00 crc kubenswrapper[4869]: I0218 06:00:00.151992 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 18 06:00:00 crc kubenswrapper[4869]: I0218 06:00:00.153200 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 18 06:00:00 crc kubenswrapper[4869]: I0218 06:00:00.156482 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523240-4nxtb"]
Feb 18 06:00:00 crc kubenswrapper[4869]: I0218 06:00:00.337800 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b78c004-29c7-4c37-8eac-826aa3f04eb9-config-volume\") pod \"collect-profiles-29523240-4nxtb\" (UID: \"9b78c004-29c7-4c37-8eac-826aa3f04eb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-4nxtb"
Feb 18 06:00:00 crc kubenswrapper[4869]: I0218 06:00:00.337943 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8rx2\" (UniqueName: \"kubernetes.io/projected/9b78c004-29c7-4c37-8eac-826aa3f04eb9-kube-api-access-h8rx2\") pod \"collect-profiles-29523240-4nxtb\" (UID: \"9b78c004-29c7-4c37-8eac-826aa3f04eb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-4nxtb"
Feb 18 06:00:00 crc kubenswrapper[4869]: I0218 06:00:00.338018 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b78c004-29c7-4c37-8eac-826aa3f04eb9-secret-volume\") pod \"collect-profiles-29523240-4nxtb\" (UID: \"9b78c004-29c7-4c37-8eac-826aa3f04eb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-4nxtb"
Feb 18 06:00:00 crc kubenswrapper[4869]: I0218 06:00:00.439216 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8rx2\" (UniqueName: \"kubernetes.io/projected/9b78c004-29c7-4c37-8eac-826aa3f04eb9-kube-api-access-h8rx2\") pod \"collect-profiles-29523240-4nxtb\" (UID: \"9b78c004-29c7-4c37-8eac-826aa3f04eb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-4nxtb"
Feb 18 06:00:00 crc kubenswrapper[4869]: I0218 06:00:00.439282 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b78c004-29c7-4c37-8eac-826aa3f04eb9-secret-volume\") pod \"collect-profiles-29523240-4nxtb\" (UID: \"9b78c004-29c7-4c37-8eac-826aa3f04eb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-4nxtb"
Feb 18 06:00:00 crc kubenswrapper[4869]: I0218 06:00:00.439362 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b78c004-29c7-4c37-8eac-826aa3f04eb9-config-volume\") pod \"collect-profiles-29523240-4nxtb\" (UID: \"9b78c004-29c7-4c37-8eac-826aa3f04eb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-4nxtb"
Feb 18 06:00:00 crc kubenswrapper[4869]: I0218 06:00:00.440214 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b78c004-29c7-4c37-8eac-826aa3f04eb9-config-volume\") pod \"collect-profiles-29523240-4nxtb\" (UID: \"9b78c004-29c7-4c37-8eac-826aa3f04eb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-4nxtb"
Feb 18 06:00:00 crc kubenswrapper[4869]: I0218 06:00:00.444761 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b78c004-29c7-4c37-8eac-826aa3f04eb9-secret-volume\") pod \"collect-profiles-29523240-4nxtb\" (UID: \"9b78c004-29c7-4c37-8eac-826aa3f04eb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-4nxtb"
Feb 18 06:00:00 crc kubenswrapper[4869]: I0218 06:00:00.454899 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8rx2\" (UniqueName: \"kubernetes.io/projected/9b78c004-29c7-4c37-8eac-826aa3f04eb9-kube-api-access-h8rx2\") pod \"collect-profiles-29523240-4nxtb\" (UID: \"9b78c004-29c7-4c37-8eac-826aa3f04eb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-4nxtb"
Feb 18 06:00:00 crc kubenswrapper[4869]: I0218 06:00:00.470214 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-4nxtb"
Feb 18 06:00:00 crc kubenswrapper[4869]: I0218 06:00:00.651317 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523240-4nxtb"]
Feb 18 06:00:00 crc kubenswrapper[4869]: I0218 06:00:00.966570 4869 generic.go:334] "Generic (PLEG): container finished" podID="9b78c004-29c7-4c37-8eac-826aa3f04eb9" containerID="3af12797c6f08fcd50bee7de7f7d0c55719b2411fd0f91380d165a9d29f04fe1" exitCode=0
Feb 18 06:00:00 crc kubenswrapper[4869]: I0218 06:00:00.966630 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-4nxtb" event={"ID":"9b78c004-29c7-4c37-8eac-826aa3f04eb9","Type":"ContainerDied","Data":"3af12797c6f08fcd50bee7de7f7d0c55719b2411fd0f91380d165a9d29f04fe1"}
Feb 18 06:00:00 crc kubenswrapper[4869]: I0218 06:00:00.966848 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-4nxtb" event={"ID":"9b78c004-29c7-4c37-8eac-826aa3f04eb9","Type":"ContainerStarted","Data":"349ca36f9dac1e37957cc05ad0f15e12515f4debdc785bf84236a6568228f824"}
Feb 18 06:00:01 crc kubenswrapper[4869]: I0218 06:00:01.378098 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tfzgn"
Feb 18 06:00:01 crc kubenswrapper[4869]: I0218 06:00:01.378242 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tfzgn"
Feb 18 06:00:01 crc kubenswrapper[4869]: I0218 06:00:01.421049 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tfzgn"
Feb 18 06:00:02 crc kubenswrapper[4869]: I0218 06:00:02.010604 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tfzgn"
Feb 18 06:00:02 crc kubenswrapper[4869]: I0218 06:00:02.304649 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-4nxtb"
Feb 18 06:00:02 crc kubenswrapper[4869]: I0218 06:00:02.506426 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8rx2\" (UniqueName: \"kubernetes.io/projected/9b78c004-29c7-4c37-8eac-826aa3f04eb9-kube-api-access-h8rx2\") pod \"9b78c004-29c7-4c37-8eac-826aa3f04eb9\" (UID: \"9b78c004-29c7-4c37-8eac-826aa3f04eb9\") "
Feb 18 06:00:02 crc kubenswrapper[4869]: I0218 06:00:02.507646 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b78c004-29c7-4c37-8eac-826aa3f04eb9-config-volume\") pod \"9b78c004-29c7-4c37-8eac-826aa3f04eb9\" (UID: \"9b78c004-29c7-4c37-8eac-826aa3f04eb9\") "
Feb 18 06:00:02 crc kubenswrapper[4869]: I0218 06:00:02.507852 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b78c004-29c7-4c37-8eac-826aa3f04eb9-secret-volume\") pod \"9b78c004-29c7-4c37-8eac-826aa3f04eb9\" (UID: \"9b78c004-29c7-4c37-8eac-826aa3f04eb9\") "
Feb 18 06:00:02 crc kubenswrapper[4869]: I0218 06:00:02.508209 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b78c004-29c7-4c37-8eac-826aa3f04eb9-config-volume" (OuterVolumeSpecName: "config-volume") pod "9b78c004-29c7-4c37-8eac-826aa3f04eb9" (UID: "9b78c004-29c7-4c37-8eac-826aa3f04eb9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:00:02 crc kubenswrapper[4869]: I0218 06:00:02.509396 4869 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b78c004-29c7-4c37-8eac-826aa3f04eb9-config-volume\") on node \"crc\" DevicePath \"\""
Feb 18 06:00:02 crc kubenswrapper[4869]: I0218 06:00:02.511997 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b78c004-29c7-4c37-8eac-826aa3f04eb9-kube-api-access-h8rx2" (OuterVolumeSpecName: "kube-api-access-h8rx2") pod "9b78c004-29c7-4c37-8eac-826aa3f04eb9" (UID: "9b78c004-29c7-4c37-8eac-826aa3f04eb9"). InnerVolumeSpecName "kube-api-access-h8rx2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:00:02 crc kubenswrapper[4869]: I0218 06:00:02.513812 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b78c004-29c7-4c37-8eac-826aa3f04eb9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9b78c004-29c7-4c37-8eac-826aa3f04eb9" (UID: "9b78c004-29c7-4c37-8eac-826aa3f04eb9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:00:02 crc kubenswrapper[4869]: I0218 06:00:02.610884 4869 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b78c004-29c7-4c37-8eac-826aa3f04eb9-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 18 06:00:02 crc kubenswrapper[4869]: I0218 06:00:02.610924 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8rx2\" (UniqueName: \"kubernetes.io/projected/9b78c004-29c7-4c37-8eac-826aa3f04eb9-kube-api-access-h8rx2\") on node \"crc\" DevicePath \"\""
Feb 18 06:00:02 crc kubenswrapper[4869]: I0218 06:00:02.633387 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tfzgn"]
Feb 18 06:00:02 crc kubenswrapper[4869]: I0218 06:00:02.977511 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-4nxtb"
Feb 18 06:00:02 crc kubenswrapper[4869]: I0218 06:00:02.977504 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-4nxtb" event={"ID":"9b78c004-29c7-4c37-8eac-826aa3f04eb9","Type":"ContainerDied","Data":"349ca36f9dac1e37957cc05ad0f15e12515f4debdc785bf84236a6568228f824"}
Feb 18 06:00:02 crc kubenswrapper[4869]: I0218 06:00:02.977650 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="349ca36f9dac1e37957cc05ad0f15e12515f4debdc785bf84236a6568228f824"
Feb 18 06:00:03 crc kubenswrapper[4869]: I0218 06:00:03.544658 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-55df77c686-fqtt5"
Feb 18 06:00:03 crc kubenswrapper[4869]: I0218 06:00:03.982286 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tfzgn" podUID="d0c848a4-7ab3-4e02-8508-12e5c83e09de" containerName="registry-server" containerID="cri-o://32844ab1e06b97646d2efbe4f34f9c7aa3276db96d3a18b523a45a3549e46c66" gracePeriod=2
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.329298 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tfzgn"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.351738 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-h4mbr"]
Feb 18 06:00:04 crc kubenswrapper[4869]: E0218 06:00:04.352049 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0c848a4-7ab3-4e02-8508-12e5c83e09de" containerName="extract-utilities"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.352074 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0c848a4-7ab3-4e02-8508-12e5c83e09de" containerName="extract-utilities"
Feb 18 06:00:04 crc kubenswrapper[4869]: E0218 06:00:04.352094 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0c848a4-7ab3-4e02-8508-12e5c83e09de" containerName="registry-server"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.352102 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0c848a4-7ab3-4e02-8508-12e5c83e09de" containerName="registry-server"
Feb 18 06:00:04 crc kubenswrapper[4869]: E0218 06:00:04.352112 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0c848a4-7ab3-4e02-8508-12e5c83e09de" containerName="extract-content"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.352119 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0c848a4-7ab3-4e02-8508-12e5c83e09de" containerName="extract-content"
Feb 18 06:00:04 crc kubenswrapper[4869]: E0218 06:00:04.352136 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b78c004-29c7-4c37-8eac-826aa3f04eb9" containerName="collect-profiles"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.352144 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b78c004-29c7-4c37-8eac-826aa3f04eb9" containerName="collect-profiles"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.352273 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0c848a4-7ab3-4e02-8508-12e5c83e09de" containerName="registry-server"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.352298 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b78c004-29c7-4c37-8eac-826aa3f04eb9" containerName="collect-profiles"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.354692 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-h4mbr"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.361618 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.361664 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.361908 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-hrr94"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.364210 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-r6mvd"]
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.365037 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-r6mvd"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.370506 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.382267 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-r6mvd"]
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.434567 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-v8dwx"]
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.435389 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0c848a4-7ab3-4e02-8508-12e5c83e09de-utilities\") pod \"d0c848a4-7ab3-4e02-8508-12e5c83e09de\" (UID: \"d0c848a4-7ab3-4e02-8508-12e5c83e09de\") "
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.435523 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0c848a4-7ab3-4e02-8508-12e5c83e09de-catalog-content\") pod \"d0c848a4-7ab3-4e02-8508-12e5c83e09de\" (UID: \"d0c848a4-7ab3-4e02-8508-12e5c83e09de\") "
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.435590 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hltlf\" (UniqueName: \"kubernetes.io/projected/d0c848a4-7ab3-4e02-8508-12e5c83e09de-kube-api-access-hltlf\") pod \"d0c848a4-7ab3-4e02-8508-12e5c83e09de\" (UID: \"d0c848a4-7ab3-4e02-8508-12e5c83e09de\") "
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.436980 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/24b94d41-cd3f-4c37-86a1-0ed957404bab-metrics\") pod \"frr-k8s-h4mbr\" (UID: \"24b94d41-cd3f-4c37-86a1-0ed957404bab\") " pod="metallb-system/frr-k8s-h4mbr"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.437052 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/24b94d41-cd3f-4c37-86a1-0ed957404bab-frr-startup\") pod \"frr-k8s-h4mbr\" (UID: \"24b94d41-cd3f-4c37-86a1-0ed957404bab\") " pod="metallb-system/frr-k8s-h4mbr"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.437170 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gt9j\" (UniqueName: \"kubernetes.io/projected/c67c84c4-b4c3-4336-a6b9-3543258cea17-kube-api-access-4gt9j\") pod \"frr-k8s-webhook-server-78b44bf5bb-r6mvd\" (UID: \"c67c84c4-b4c3-4336-a6b9-3543258cea17\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-r6mvd"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.437214 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kqck\" (UniqueName: \"kubernetes.io/projected/24b94d41-cd3f-4c37-86a1-0ed957404bab-kube-api-access-8kqck\") pod \"frr-k8s-h4mbr\" (UID: \"24b94d41-cd3f-4c37-86a1-0ed957404bab\") " pod="metallb-system/frr-k8s-h4mbr"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.437257 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/24b94d41-cd3f-4c37-86a1-0ed957404bab-reloader\") pod \"frr-k8s-h4mbr\" (UID: \"24b94d41-cd3f-4c37-86a1-0ed957404bab\") " pod="metallb-system/frr-k8s-h4mbr"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.437290 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c67c84c4-b4c3-4336-a6b9-3543258cea17-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-r6mvd\" (UID: \"c67c84c4-b4c3-4336-a6b9-3543258cea17\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-r6mvd"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.437325 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/24b94d41-cd3f-4c37-86a1-0ed957404bab-frr-conf\") pod \"frr-k8s-h4mbr\" (UID: \"24b94d41-cd3f-4c37-86a1-0ed957404bab\") " pod="metallb-system/frr-k8s-h4mbr"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.437428 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24b94d41-cd3f-4c37-86a1-0ed957404bab-metrics-certs\") pod \"frr-k8s-h4mbr\" (UID: \"24b94d41-cd3f-4c37-86a1-0ed957404bab\") " pod="metallb-system/frr-k8s-h4mbr"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.437467 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/24b94d41-cd3f-4c37-86a1-0ed957404bab-frr-sockets\") pod \"frr-k8s-h4mbr\" (UID: \"24b94d41-cd3f-4c37-86a1-0ed957404bab\") " pod="metallb-system/frr-k8s-h4mbr"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.437901 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0c848a4-7ab3-4e02-8508-12e5c83e09de-utilities" (OuterVolumeSpecName: "utilities") pod "d0c848a4-7ab3-4e02-8508-12e5c83e09de" (UID: "d0c848a4-7ab3-4e02-8508-12e5c83e09de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.449701 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0c848a4-7ab3-4e02-8508-12e5c83e09de-kube-api-access-hltlf" (OuterVolumeSpecName: "kube-api-access-hltlf") pod "d0c848a4-7ab3-4e02-8508-12e5c83e09de" (UID: "d0c848a4-7ab3-4e02-8508-12e5c83e09de"). InnerVolumeSpecName "kube-api-access-hltlf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.460012 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-v8dwx"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.473284 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.473516 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.473638 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-4skzl"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.475178 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.481900 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-sh2wz"]
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.483547 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-sh2wz"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.500612 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.500927 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-sh2wz"]
Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.530439 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0c848a4-7ab3-4e02-8508-12e5c83e09de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0c848a4-7ab3-4e02-8508-12e5c83e09de" (UID: "d0c848a4-7ab3-4e02-8508-12e5c83e09de"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.539331 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/24b94d41-cd3f-4c37-86a1-0ed957404bab-reloader\") pod \"frr-k8s-h4mbr\" (UID: \"24b94d41-cd3f-4c37-86a1-0ed957404bab\") " pod="metallb-system/frr-k8s-h4mbr" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.539385 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c67c84c4-b4c3-4336-a6b9-3543258cea17-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-r6mvd\" (UID: \"c67c84c4-b4c3-4336-a6b9-3543258cea17\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-r6mvd" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.539410 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/24b94d41-cd3f-4c37-86a1-0ed957404bab-frr-conf\") pod \"frr-k8s-h4mbr\" (UID: \"24b94d41-cd3f-4c37-86a1-0ed957404bab\") " pod="metallb-system/frr-k8s-h4mbr" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.539432 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24b94d41-cd3f-4c37-86a1-0ed957404bab-metrics-certs\") pod \"frr-k8s-h4mbr\" (UID: \"24b94d41-cd3f-4c37-86a1-0ed957404bab\") " pod="metallb-system/frr-k8s-h4mbr" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.539462 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f63c8d8a-ba54-4ddf-9105-5f886b7984d9-metallb-excludel2\") pod \"speaker-v8dwx\" (UID: \"f63c8d8a-ba54-4ddf-9105-5f886b7984d9\") " pod="metallb-system/speaker-v8dwx" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 
06:00:04.539487 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/24b94d41-cd3f-4c37-86a1-0ed957404bab-frr-sockets\") pod \"frr-k8s-h4mbr\" (UID: \"24b94d41-cd3f-4c37-86a1-0ed957404bab\") " pod="metallb-system/frr-k8s-h4mbr" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.539516 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e23bbfea-0160-46be-ae71-7ff977953af2-cert\") pod \"controller-69bbfbf88f-sh2wz\" (UID: \"e23bbfea-0160-46be-ae71-7ff977953af2\") " pod="metallb-system/controller-69bbfbf88f-sh2wz" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.539540 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e23bbfea-0160-46be-ae71-7ff977953af2-metrics-certs\") pod \"controller-69bbfbf88f-sh2wz\" (UID: \"e23bbfea-0160-46be-ae71-7ff977953af2\") " pod="metallb-system/controller-69bbfbf88f-sh2wz" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.539563 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/24b94d41-cd3f-4c37-86a1-0ed957404bab-metrics\") pod \"frr-k8s-h4mbr\" (UID: \"24b94d41-cd3f-4c37-86a1-0ed957404bab\") " pod="metallb-system/frr-k8s-h4mbr" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.539588 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f63c8d8a-ba54-4ddf-9105-5f886b7984d9-memberlist\") pod \"speaker-v8dwx\" (UID: \"f63c8d8a-ba54-4ddf-9105-5f886b7984d9\") " pod="metallb-system/speaker-v8dwx" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.539638 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/24b94d41-cd3f-4c37-86a1-0ed957404bab-frr-startup\") pod \"frr-k8s-h4mbr\" (UID: \"24b94d41-cd3f-4c37-86a1-0ed957404bab\") " pod="metallb-system/frr-k8s-h4mbr" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.539682 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gt9j\" (UniqueName: \"kubernetes.io/projected/c67c84c4-b4c3-4336-a6b9-3543258cea17-kube-api-access-4gt9j\") pod \"frr-k8s-webhook-server-78b44bf5bb-r6mvd\" (UID: \"c67c84c4-b4c3-4336-a6b9-3543258cea17\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-r6mvd" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.539710 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mvck\" (UniqueName: \"kubernetes.io/projected/f63c8d8a-ba54-4ddf-9105-5f886b7984d9-kube-api-access-5mvck\") pod \"speaker-v8dwx\" (UID: \"f63c8d8a-ba54-4ddf-9105-5f886b7984d9\") " pod="metallb-system/speaker-v8dwx" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.539722 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/24b94d41-cd3f-4c37-86a1-0ed957404bab-frr-conf\") pod \"frr-k8s-h4mbr\" (UID: \"24b94d41-cd3f-4c37-86a1-0ed957404bab\") " pod="metallb-system/frr-k8s-h4mbr" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.539757 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcls8\" (UniqueName: \"kubernetes.io/projected/e23bbfea-0160-46be-ae71-7ff977953af2-kube-api-access-qcls8\") pod \"controller-69bbfbf88f-sh2wz\" (UID: \"e23bbfea-0160-46be-ae71-7ff977953af2\") " pod="metallb-system/controller-69bbfbf88f-sh2wz" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.539768 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: 
\"kubernetes.io/empty-dir/24b94d41-cd3f-4c37-86a1-0ed957404bab-reloader\") pod \"frr-k8s-h4mbr\" (UID: \"24b94d41-cd3f-4c37-86a1-0ed957404bab\") " pod="metallb-system/frr-k8s-h4mbr" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.539805 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f63c8d8a-ba54-4ddf-9105-5f886b7984d9-metrics-certs\") pod \"speaker-v8dwx\" (UID: \"f63c8d8a-ba54-4ddf-9105-5f886b7984d9\") " pod="metallb-system/speaker-v8dwx" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.539828 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kqck\" (UniqueName: \"kubernetes.io/projected/24b94d41-cd3f-4c37-86a1-0ed957404bab-kube-api-access-8kqck\") pod \"frr-k8s-h4mbr\" (UID: \"24b94d41-cd3f-4c37-86a1-0ed957404bab\") " pod="metallb-system/frr-k8s-h4mbr" Feb 18 06:00:04 crc kubenswrapper[4869]: E0218 06:00:04.539859 4869 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.539881 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0c848a4-7ab3-4e02-8508-12e5c83e09de-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.539893 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hltlf\" (UniqueName: \"kubernetes.io/projected/d0c848a4-7ab3-4e02-8508-12e5c83e09de-kube-api-access-hltlf\") on node \"crc\" DevicePath \"\"" Feb 18 06:00:04 crc kubenswrapper[4869]: E0218 06:00:04.539906 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24b94d41-cd3f-4c37-86a1-0ed957404bab-metrics-certs podName:24b94d41-cd3f-4c37-86a1-0ed957404bab nodeName:}" failed. 
No retries permitted until 2026-02-18 06:00:05.039889625 +0000 UTC m=+702.208977857 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/24b94d41-cd3f-4c37-86a1-0ed957404bab-metrics-certs") pod "frr-k8s-h4mbr" (UID: "24b94d41-cd3f-4c37-86a1-0ed957404bab") : secret "frr-k8s-certs-secret" not found Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.539922 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0c848a4-7ab3-4e02-8508-12e5c83e09de-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.540149 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/24b94d41-cd3f-4c37-86a1-0ed957404bab-frr-sockets\") pod \"frr-k8s-h4mbr\" (UID: \"24b94d41-cd3f-4c37-86a1-0ed957404bab\") " pod="metallb-system/frr-k8s-h4mbr" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.540379 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/24b94d41-cd3f-4c37-86a1-0ed957404bab-metrics\") pod \"frr-k8s-h4mbr\" (UID: \"24b94d41-cd3f-4c37-86a1-0ed957404bab\") " pod="metallb-system/frr-k8s-h4mbr" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.541285 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/24b94d41-cd3f-4c37-86a1-0ed957404bab-frr-startup\") pod \"frr-k8s-h4mbr\" (UID: \"24b94d41-cd3f-4c37-86a1-0ed957404bab\") " pod="metallb-system/frr-k8s-h4mbr" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.546933 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c67c84c4-b4c3-4336-a6b9-3543258cea17-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-r6mvd\" (UID: \"c67c84c4-b4c3-4336-a6b9-3543258cea17\") " 
pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-r6mvd" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.557474 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kqck\" (UniqueName: \"kubernetes.io/projected/24b94d41-cd3f-4c37-86a1-0ed957404bab-kube-api-access-8kqck\") pod \"frr-k8s-h4mbr\" (UID: \"24b94d41-cd3f-4c37-86a1-0ed957404bab\") " pod="metallb-system/frr-k8s-h4mbr" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.559549 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gt9j\" (UniqueName: \"kubernetes.io/projected/c67c84c4-b4c3-4336-a6b9-3543258cea17-kube-api-access-4gt9j\") pod \"frr-k8s-webhook-server-78b44bf5bb-r6mvd\" (UID: \"c67c84c4-b4c3-4336-a6b9-3543258cea17\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-r6mvd" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.640890 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f63c8d8a-ba54-4ddf-9105-5f886b7984d9-memberlist\") pod \"speaker-v8dwx\" (UID: \"f63c8d8a-ba54-4ddf-9105-5f886b7984d9\") " pod="metallb-system/speaker-v8dwx" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.641268 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mvck\" (UniqueName: \"kubernetes.io/projected/f63c8d8a-ba54-4ddf-9105-5f886b7984d9-kube-api-access-5mvck\") pod \"speaker-v8dwx\" (UID: \"f63c8d8a-ba54-4ddf-9105-5f886b7984d9\") " pod="metallb-system/speaker-v8dwx" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.641392 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcls8\" (UniqueName: \"kubernetes.io/projected/e23bbfea-0160-46be-ae71-7ff977953af2-kube-api-access-qcls8\") pod \"controller-69bbfbf88f-sh2wz\" (UID: \"e23bbfea-0160-46be-ae71-7ff977953af2\") " pod="metallb-system/controller-69bbfbf88f-sh2wz" Feb 18 
06:00:04 crc kubenswrapper[4869]: E0218 06:00:04.641129 4869 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.641483 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f63c8d8a-ba54-4ddf-9105-5f886b7984d9-metrics-certs\") pod \"speaker-v8dwx\" (UID: \"f63c8d8a-ba54-4ddf-9105-5f886b7984d9\") " pod="metallb-system/speaker-v8dwx" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.641835 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f63c8d8a-ba54-4ddf-9105-5f886b7984d9-metallb-excludel2\") pod \"speaker-v8dwx\" (UID: \"f63c8d8a-ba54-4ddf-9105-5f886b7984d9\") " pod="metallb-system/speaker-v8dwx" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.641969 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e23bbfea-0160-46be-ae71-7ff977953af2-cert\") pod \"controller-69bbfbf88f-sh2wz\" (UID: \"e23bbfea-0160-46be-ae71-7ff977953af2\") " pod="metallb-system/controller-69bbfbf88f-sh2wz" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.642035 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e23bbfea-0160-46be-ae71-7ff977953af2-metrics-certs\") pod \"controller-69bbfbf88f-sh2wz\" (UID: \"e23bbfea-0160-46be-ae71-7ff977953af2\") " pod="metallb-system/controller-69bbfbf88f-sh2wz" Feb 18 06:00:04 crc kubenswrapper[4869]: E0218 06:00:04.642117 4869 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.642499 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" 
(UniqueName: \"kubernetes.io/configmap/f63c8d8a-ba54-4ddf-9105-5f886b7984d9-metallb-excludel2\") pod \"speaker-v8dwx\" (UID: \"f63c8d8a-ba54-4ddf-9105-5f886b7984d9\") " pod="metallb-system/speaker-v8dwx" Feb 18 06:00:04 crc kubenswrapper[4869]: E0218 06:00:04.642575 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f63c8d8a-ba54-4ddf-9105-5f886b7984d9-memberlist podName:f63c8d8a-ba54-4ddf-9105-5f886b7984d9 nodeName:}" failed. No retries permitted until 2026-02-18 06:00:05.142559036 +0000 UTC m=+702.311647268 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f63c8d8a-ba54-4ddf-9105-5f886b7984d9-memberlist") pod "speaker-v8dwx" (UID: "f63c8d8a-ba54-4ddf-9105-5f886b7984d9") : secret "metallb-memberlist" not found Feb 18 06:00:04 crc kubenswrapper[4869]: E0218 06:00:04.642591 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e23bbfea-0160-46be-ae71-7ff977953af2-metrics-certs podName:e23bbfea-0160-46be-ae71-7ff977953af2 nodeName:}" failed. No retries permitted until 2026-02-18 06:00:05.142585227 +0000 UTC m=+702.311673449 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e23bbfea-0160-46be-ae71-7ff977953af2-metrics-certs") pod "controller-69bbfbf88f-sh2wz" (UID: "e23bbfea-0160-46be-ae71-7ff977953af2") : secret "controller-certs-secret" not found Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.646264 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e23bbfea-0160-46be-ae71-7ff977953af2-cert\") pod \"controller-69bbfbf88f-sh2wz\" (UID: \"e23bbfea-0160-46be-ae71-7ff977953af2\") " pod="metallb-system/controller-69bbfbf88f-sh2wz" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.646312 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f63c8d8a-ba54-4ddf-9105-5f886b7984d9-metrics-certs\") pod \"speaker-v8dwx\" (UID: \"f63c8d8a-ba54-4ddf-9105-5f886b7984d9\") " pod="metallb-system/speaker-v8dwx" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.667494 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcls8\" (UniqueName: \"kubernetes.io/projected/e23bbfea-0160-46be-ae71-7ff977953af2-kube-api-access-qcls8\") pod \"controller-69bbfbf88f-sh2wz\" (UID: \"e23bbfea-0160-46be-ae71-7ff977953af2\") " pod="metallb-system/controller-69bbfbf88f-sh2wz" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.674458 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mvck\" (UniqueName: \"kubernetes.io/projected/f63c8d8a-ba54-4ddf-9105-5f886b7984d9-kube-api-access-5mvck\") pod \"speaker-v8dwx\" (UID: \"f63c8d8a-ba54-4ddf-9105-5f886b7984d9\") " pod="metallb-system/speaker-v8dwx" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.688026 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-r6mvd" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.989233 4869 generic.go:334] "Generic (PLEG): container finished" podID="d0c848a4-7ab3-4e02-8508-12e5c83e09de" containerID="32844ab1e06b97646d2efbe4f34f9c7aa3276db96d3a18b523a45a3549e46c66" exitCode=0 Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.989302 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tfzgn" event={"ID":"d0c848a4-7ab3-4e02-8508-12e5c83e09de","Type":"ContainerDied","Data":"32844ab1e06b97646d2efbe4f34f9c7aa3276db96d3a18b523a45a3549e46c66"} Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.989607 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tfzgn" event={"ID":"d0c848a4-7ab3-4e02-8508-12e5c83e09de","Type":"ContainerDied","Data":"ace7704ee5812ea716580f3cdea520ec068111130caba36d70a30a8e6911aa5d"} Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.989633 4869 scope.go:117] "RemoveContainer" containerID="32844ab1e06b97646d2efbe4f34f9c7aa3276db96d3a18b523a45a3549e46c66" Feb 18 06:00:04 crc kubenswrapper[4869]: I0218 06:00:04.989337 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tfzgn" Feb 18 06:00:05 crc kubenswrapper[4869]: I0218 06:00:05.005925 4869 scope.go:117] "RemoveContainer" containerID="806b941898f16aa18d5fb06edb38650dccaaef3565b0422c352e380b4caa843e" Feb 18 06:00:05 crc kubenswrapper[4869]: I0218 06:00:05.022676 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tfzgn"] Feb 18 06:00:05 crc kubenswrapper[4869]: I0218 06:00:05.027031 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tfzgn"] Feb 18 06:00:05 crc kubenswrapper[4869]: I0218 06:00:05.034727 4869 scope.go:117] "RemoveContainer" containerID="3d28a417cadf7093721aa0efaea4d6b3df48569b13618d8fb8f53a51a0f715ae" Feb 18 06:00:05 crc kubenswrapper[4869]: I0218 06:00:05.045937 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24b94d41-cd3f-4c37-86a1-0ed957404bab-metrics-certs\") pod \"frr-k8s-h4mbr\" (UID: \"24b94d41-cd3f-4c37-86a1-0ed957404bab\") " pod="metallb-system/frr-k8s-h4mbr" Feb 18 06:00:05 crc kubenswrapper[4869]: I0218 06:00:05.049807 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24b94d41-cd3f-4c37-86a1-0ed957404bab-metrics-certs\") pod \"frr-k8s-h4mbr\" (UID: \"24b94d41-cd3f-4c37-86a1-0ed957404bab\") " pod="metallb-system/frr-k8s-h4mbr" Feb 18 06:00:05 crc kubenswrapper[4869]: I0218 06:00:05.055916 4869 scope.go:117] "RemoveContainer" containerID="32844ab1e06b97646d2efbe4f34f9c7aa3276db96d3a18b523a45a3549e46c66" Feb 18 06:00:05 crc kubenswrapper[4869]: E0218 06:00:05.056316 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32844ab1e06b97646d2efbe4f34f9c7aa3276db96d3a18b523a45a3549e46c66\": container with ID starting with 
32844ab1e06b97646d2efbe4f34f9c7aa3276db96d3a18b523a45a3549e46c66 not found: ID does not exist" containerID="32844ab1e06b97646d2efbe4f34f9c7aa3276db96d3a18b523a45a3549e46c66" Feb 18 06:00:05 crc kubenswrapper[4869]: I0218 06:00:05.056368 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32844ab1e06b97646d2efbe4f34f9c7aa3276db96d3a18b523a45a3549e46c66"} err="failed to get container status \"32844ab1e06b97646d2efbe4f34f9c7aa3276db96d3a18b523a45a3549e46c66\": rpc error: code = NotFound desc = could not find container \"32844ab1e06b97646d2efbe4f34f9c7aa3276db96d3a18b523a45a3549e46c66\": container with ID starting with 32844ab1e06b97646d2efbe4f34f9c7aa3276db96d3a18b523a45a3549e46c66 not found: ID does not exist" Feb 18 06:00:05 crc kubenswrapper[4869]: I0218 06:00:05.056400 4869 scope.go:117] "RemoveContainer" containerID="806b941898f16aa18d5fb06edb38650dccaaef3565b0422c352e380b4caa843e" Feb 18 06:00:05 crc kubenswrapper[4869]: E0218 06:00:05.056984 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"806b941898f16aa18d5fb06edb38650dccaaef3565b0422c352e380b4caa843e\": container with ID starting with 806b941898f16aa18d5fb06edb38650dccaaef3565b0422c352e380b4caa843e not found: ID does not exist" containerID="806b941898f16aa18d5fb06edb38650dccaaef3565b0422c352e380b4caa843e" Feb 18 06:00:05 crc kubenswrapper[4869]: I0218 06:00:05.057023 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"806b941898f16aa18d5fb06edb38650dccaaef3565b0422c352e380b4caa843e"} err="failed to get container status \"806b941898f16aa18d5fb06edb38650dccaaef3565b0422c352e380b4caa843e\": rpc error: code = NotFound desc = could not find container \"806b941898f16aa18d5fb06edb38650dccaaef3565b0422c352e380b4caa843e\": container with ID starting with 806b941898f16aa18d5fb06edb38650dccaaef3565b0422c352e380b4caa843e not found: ID does not 
exist" Feb 18 06:00:05 crc kubenswrapper[4869]: I0218 06:00:05.057048 4869 scope.go:117] "RemoveContainer" containerID="3d28a417cadf7093721aa0efaea4d6b3df48569b13618d8fb8f53a51a0f715ae" Feb 18 06:00:05 crc kubenswrapper[4869]: E0218 06:00:05.057407 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d28a417cadf7093721aa0efaea4d6b3df48569b13618d8fb8f53a51a0f715ae\": container with ID starting with 3d28a417cadf7093721aa0efaea4d6b3df48569b13618d8fb8f53a51a0f715ae not found: ID does not exist" containerID="3d28a417cadf7093721aa0efaea4d6b3df48569b13618d8fb8f53a51a0f715ae" Feb 18 06:00:05 crc kubenswrapper[4869]: I0218 06:00:05.057441 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d28a417cadf7093721aa0efaea4d6b3df48569b13618d8fb8f53a51a0f715ae"} err="failed to get container status \"3d28a417cadf7093721aa0efaea4d6b3df48569b13618d8fb8f53a51a0f715ae\": rpc error: code = NotFound desc = could not find container \"3d28a417cadf7093721aa0efaea4d6b3df48569b13618d8fb8f53a51a0f715ae\": container with ID starting with 3d28a417cadf7093721aa0efaea4d6b3df48569b13618d8fb8f53a51a0f715ae not found: ID does not exist" Feb 18 06:00:05 crc kubenswrapper[4869]: I0218 06:00:05.095069 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-r6mvd"] Feb 18 06:00:05 crc kubenswrapper[4869]: W0218 06:00:05.102948 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc67c84c4_b4c3_4336_a6b9_3543258cea17.slice/crio-6f114ccccf9ba9ad64cc643448e8e688b3dbfb9f49879c48c17c21038a1804e0 WatchSource:0}: Error finding container 6f114ccccf9ba9ad64cc643448e8e688b3dbfb9f49879c48c17c21038a1804e0: Status 404 returned error can't find the container with id 6f114ccccf9ba9ad64cc643448e8e688b3dbfb9f49879c48c17c21038a1804e0 Feb 18 06:00:05 crc 
kubenswrapper[4869]: I0218 06:00:05.146810 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e23bbfea-0160-46be-ae71-7ff977953af2-metrics-certs\") pod \"controller-69bbfbf88f-sh2wz\" (UID: \"e23bbfea-0160-46be-ae71-7ff977953af2\") " pod="metallb-system/controller-69bbfbf88f-sh2wz" Feb 18 06:00:05 crc kubenswrapper[4869]: I0218 06:00:05.146896 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f63c8d8a-ba54-4ddf-9105-5f886b7984d9-memberlist\") pod \"speaker-v8dwx\" (UID: \"f63c8d8a-ba54-4ddf-9105-5f886b7984d9\") " pod="metallb-system/speaker-v8dwx" Feb 18 06:00:05 crc kubenswrapper[4869]: E0218 06:00:05.147086 4869 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 18 06:00:05 crc kubenswrapper[4869]: E0218 06:00:05.147178 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f63c8d8a-ba54-4ddf-9105-5f886b7984d9-memberlist podName:f63c8d8a-ba54-4ddf-9105-5f886b7984d9 nodeName:}" failed. No retries permitted until 2026-02-18 06:00:06.147143838 +0000 UTC m=+703.316232060 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f63c8d8a-ba54-4ddf-9105-5f886b7984d9-memberlist") pod "speaker-v8dwx" (UID: "f63c8d8a-ba54-4ddf-9105-5f886b7984d9") : secret "metallb-memberlist" not found Feb 18 06:00:05 crc kubenswrapper[4869]: I0218 06:00:05.150463 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e23bbfea-0160-46be-ae71-7ff977953af2-metrics-certs\") pod \"controller-69bbfbf88f-sh2wz\" (UID: \"e23bbfea-0160-46be-ae71-7ff977953af2\") " pod="metallb-system/controller-69bbfbf88f-sh2wz" Feb 18 06:00:05 crc kubenswrapper[4869]: I0218 06:00:05.158711 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hnlhm" Feb 18 06:00:05 crc kubenswrapper[4869]: I0218 06:00:05.158793 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hnlhm" Feb 18 06:00:05 crc kubenswrapper[4869]: I0218 06:00:05.200946 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hnlhm" Feb 18 06:00:05 crc kubenswrapper[4869]: I0218 06:00:05.272046 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-h4mbr" Feb 18 06:00:05 crc kubenswrapper[4869]: I0218 06:00:05.417203 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-sh2wz" Feb 18 06:00:05 crc kubenswrapper[4869]: I0218 06:00:05.483030 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0c848a4-7ab3-4e02-8508-12e5c83e09de" path="/var/lib/kubelet/pods/d0c848a4-7ab3-4e02-8508-12e5c83e09de/volumes" Feb 18 06:00:05 crc kubenswrapper[4869]: I0218 06:00:05.853036 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-sh2wz"] Feb 18 06:00:05 crc kubenswrapper[4869]: I0218 06:00:05.994624 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h4mbr" event={"ID":"24b94d41-cd3f-4c37-86a1-0ed957404bab","Type":"ContainerStarted","Data":"6436ba992f96411cdebd51d14538b23a0a5e8641619565a2196dcd443d35dfdf"} Feb 18 06:00:05 crc kubenswrapper[4869]: I0218 06:00:05.996413 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-r6mvd" event={"ID":"c67c84c4-b4c3-4336-a6b9-3543258cea17","Type":"ContainerStarted","Data":"6f114ccccf9ba9ad64cc643448e8e688b3dbfb9f49879c48c17c21038a1804e0"} Feb 18 06:00:05 crc kubenswrapper[4869]: I0218 06:00:05.997845 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-sh2wz" event={"ID":"e23bbfea-0160-46be-ae71-7ff977953af2","Type":"ContainerStarted","Data":"a111314f14f27d503d78b05ea251e92d9fd9bba71dcac08c5262f78b009c5c6e"} Feb 18 06:00:05 crc kubenswrapper[4869]: I0218 06:00:05.997878 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-sh2wz" event={"ID":"e23bbfea-0160-46be-ae71-7ff977953af2","Type":"ContainerStarted","Data":"0cd2742e2a8fcacfaee267460ae9b997f9eebf65682f9ba9190744bc1397a525"} Feb 18 06:00:06 crc kubenswrapper[4869]: I0218 06:00:06.049189 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hnlhm" Feb 18 06:00:06 crc kubenswrapper[4869]: 
I0218 06:00:06.159886 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f63c8d8a-ba54-4ddf-9105-5f886b7984d9-memberlist\") pod \"speaker-v8dwx\" (UID: \"f63c8d8a-ba54-4ddf-9105-5f886b7984d9\") " pod="metallb-system/speaker-v8dwx" Feb 18 06:00:06 crc kubenswrapper[4869]: I0218 06:00:06.172557 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f63c8d8a-ba54-4ddf-9105-5f886b7984d9-memberlist\") pod \"speaker-v8dwx\" (UID: \"f63c8d8a-ba54-4ddf-9105-5f886b7984d9\") " pod="metallb-system/speaker-v8dwx" Feb 18 06:00:06 crc kubenswrapper[4869]: I0218 06:00:06.308615 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-v8dwx" Feb 18 06:00:06 crc kubenswrapper[4869]: W0218 06:00:06.337333 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf63c8d8a_ba54_4ddf_9105_5f886b7984d9.slice/crio-f79ca24ed165278b00fc58ca51c1445a3abfa9b6f63de5af27ea3520963fb309 WatchSource:0}: Error finding container f79ca24ed165278b00fc58ca51c1445a3abfa9b6f63de5af27ea3520963fb309: Status 404 returned error can't find the container with id f79ca24ed165278b00fc58ca51c1445a3abfa9b6f63de5af27ea3520963fb309 Feb 18 06:00:07 crc kubenswrapper[4869]: I0218 06:00:07.005182 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-sh2wz" event={"ID":"e23bbfea-0160-46be-ae71-7ff977953af2","Type":"ContainerStarted","Data":"8a7132879199a7f81777c61dd7627701974777625d288523fbdd2b5727c0085f"} Feb 18 06:00:07 crc kubenswrapper[4869]: I0218 06:00:07.005852 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-sh2wz" Feb 18 06:00:07 crc kubenswrapper[4869]: I0218 06:00:07.034411 4869 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="metallb-system/controller-69bbfbf88f-sh2wz" podStartSLOduration=3.034395071 podStartE2EDuration="3.034395071s" podCreationTimestamp="2026-02-18 06:00:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:00:07.033172211 +0000 UTC m=+704.202260443" watchObservedRunningTime="2026-02-18 06:00:07.034395071 +0000 UTC m=+704.203483303" Feb 18 06:00:07 crc kubenswrapper[4869]: I0218 06:00:07.041381 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-v8dwx" event={"ID":"f63c8d8a-ba54-4ddf-9105-5f886b7984d9","Type":"ContainerStarted","Data":"eda047005e4ed32906464c6afd0bf010c8e8c44530cd88016f5420db77a3deb3"} Feb 18 06:00:07 crc kubenswrapper[4869]: I0218 06:00:07.041420 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-v8dwx" event={"ID":"f63c8d8a-ba54-4ddf-9105-5f886b7984d9","Type":"ContainerStarted","Data":"5aa04c4887d7b68d5180a89a0b3689c1cf495696758057dfd2dd1bb165152896"} Feb 18 06:00:07 crc kubenswrapper[4869]: I0218 06:00:07.041430 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-v8dwx" event={"ID":"f63c8d8a-ba54-4ddf-9105-5f886b7984d9","Type":"ContainerStarted","Data":"f79ca24ed165278b00fc58ca51c1445a3abfa9b6f63de5af27ea3520963fb309"} Feb 18 06:00:07 crc kubenswrapper[4869]: I0218 06:00:07.042023 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-v8dwx" Feb 18 06:00:07 crc kubenswrapper[4869]: I0218 06:00:07.067781 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-v8dwx" podStartSLOduration=3.067765134 podStartE2EDuration="3.067765134s" podCreationTimestamp="2026-02-18 06:00:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:00:07.065525429 +0000 UTC 
m=+704.234613661" watchObservedRunningTime="2026-02-18 06:00:07.067765134 +0000 UTC m=+704.236853356" Feb 18 06:00:07 crc kubenswrapper[4869]: I0218 06:00:07.433702 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnlhm"] Feb 18 06:00:08 crc kubenswrapper[4869]: I0218 06:00:08.047371 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hnlhm" podUID="4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c" containerName="registry-server" containerID="cri-o://24f65ab07da090311268ccb59346a764e5744b15d1f99e504d493e1bcc4c8db2" gracePeriod=2 Feb 18 06:00:08 crc kubenswrapper[4869]: I0218 06:00:08.503470 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnlhm" Feb 18 06:00:08 crc kubenswrapper[4869]: I0218 06:00:08.597771 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2lpn\" (UniqueName: \"kubernetes.io/projected/4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c-kube-api-access-s2lpn\") pod \"4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c\" (UID: \"4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c\") " Feb 18 06:00:08 crc kubenswrapper[4869]: I0218 06:00:08.597822 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c-utilities\") pod \"4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c\" (UID: \"4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c\") " Feb 18 06:00:08 crc kubenswrapper[4869]: I0218 06:00:08.597862 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c-catalog-content\") pod \"4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c\" (UID: \"4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c\") " Feb 18 06:00:08 crc kubenswrapper[4869]: I0218 06:00:08.598883 4869 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c-utilities" (OuterVolumeSpecName: "utilities") pod "4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c" (UID: "4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:00:08 crc kubenswrapper[4869]: I0218 06:00:08.603774 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c-kube-api-access-s2lpn" (OuterVolumeSpecName: "kube-api-access-s2lpn") pod "4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c" (UID: "4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c"). InnerVolumeSpecName "kube-api-access-s2lpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:00:08 crc kubenswrapper[4869]: I0218 06:00:08.639652 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c" (UID: "4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:00:08 crc kubenswrapper[4869]: I0218 06:00:08.699989 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2lpn\" (UniqueName: \"kubernetes.io/projected/4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c-kube-api-access-s2lpn\") on node \"crc\" DevicePath \"\"" Feb 18 06:00:08 crc kubenswrapper[4869]: I0218 06:00:08.700027 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:00:08 crc kubenswrapper[4869]: I0218 06:00:08.700039 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:00:09 crc kubenswrapper[4869]: I0218 06:00:09.057030 4869 generic.go:334] "Generic (PLEG): container finished" podID="4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c" containerID="24f65ab07da090311268ccb59346a764e5744b15d1f99e504d493e1bcc4c8db2" exitCode=0 Feb 18 06:00:09 crc kubenswrapper[4869]: I0218 06:00:09.057088 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnlhm" event={"ID":"4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c","Type":"ContainerDied","Data":"24f65ab07da090311268ccb59346a764e5744b15d1f99e504d493e1bcc4c8db2"} Feb 18 06:00:09 crc kubenswrapper[4869]: I0218 06:00:09.057116 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnlhm" event={"ID":"4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c","Type":"ContainerDied","Data":"2d05277816964333b37d9848f7cd73a96ff79be20544fcaaa028eb0df81c5cc4"} Feb 18 06:00:09 crc kubenswrapper[4869]: I0218 06:00:09.057136 4869 scope.go:117] "RemoveContainer" containerID="24f65ab07da090311268ccb59346a764e5744b15d1f99e504d493e1bcc4c8db2" Feb 18 06:00:09 crc kubenswrapper[4869]: I0218 
06:00:09.057234 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnlhm" Feb 18 06:00:09 crc kubenswrapper[4869]: I0218 06:00:09.074155 4869 scope.go:117] "RemoveContainer" containerID="e691b898c7cda94bba11b6eaff8f97af512a0557925187459f835282331120a4" Feb 18 06:00:09 crc kubenswrapper[4869]: I0218 06:00:09.094232 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnlhm"] Feb 18 06:00:09 crc kubenswrapper[4869]: I0218 06:00:09.098665 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnlhm"] Feb 18 06:00:09 crc kubenswrapper[4869]: I0218 06:00:09.105185 4869 scope.go:117] "RemoveContainer" containerID="2687aabaa10fbe9916647c606b8f30ed8fa597ecc813f0d84af94442dee35e47" Feb 18 06:00:09 crc kubenswrapper[4869]: I0218 06:00:09.123365 4869 scope.go:117] "RemoveContainer" containerID="24f65ab07da090311268ccb59346a764e5744b15d1f99e504d493e1bcc4c8db2" Feb 18 06:00:09 crc kubenswrapper[4869]: E0218 06:00:09.123761 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24f65ab07da090311268ccb59346a764e5744b15d1f99e504d493e1bcc4c8db2\": container with ID starting with 24f65ab07da090311268ccb59346a764e5744b15d1f99e504d493e1bcc4c8db2 not found: ID does not exist" containerID="24f65ab07da090311268ccb59346a764e5744b15d1f99e504d493e1bcc4c8db2" Feb 18 06:00:09 crc kubenswrapper[4869]: I0218 06:00:09.123798 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24f65ab07da090311268ccb59346a764e5744b15d1f99e504d493e1bcc4c8db2"} err="failed to get container status \"24f65ab07da090311268ccb59346a764e5744b15d1f99e504d493e1bcc4c8db2\": rpc error: code = NotFound desc = could not find container \"24f65ab07da090311268ccb59346a764e5744b15d1f99e504d493e1bcc4c8db2\": container with ID starting with 
24f65ab07da090311268ccb59346a764e5744b15d1f99e504d493e1bcc4c8db2 not found: ID does not exist" Feb 18 06:00:09 crc kubenswrapper[4869]: I0218 06:00:09.123824 4869 scope.go:117] "RemoveContainer" containerID="e691b898c7cda94bba11b6eaff8f97af512a0557925187459f835282331120a4" Feb 18 06:00:09 crc kubenswrapper[4869]: E0218 06:00:09.124106 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e691b898c7cda94bba11b6eaff8f97af512a0557925187459f835282331120a4\": container with ID starting with e691b898c7cda94bba11b6eaff8f97af512a0557925187459f835282331120a4 not found: ID does not exist" containerID="e691b898c7cda94bba11b6eaff8f97af512a0557925187459f835282331120a4" Feb 18 06:00:09 crc kubenswrapper[4869]: I0218 06:00:09.124134 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e691b898c7cda94bba11b6eaff8f97af512a0557925187459f835282331120a4"} err="failed to get container status \"e691b898c7cda94bba11b6eaff8f97af512a0557925187459f835282331120a4\": rpc error: code = NotFound desc = could not find container \"e691b898c7cda94bba11b6eaff8f97af512a0557925187459f835282331120a4\": container with ID starting with e691b898c7cda94bba11b6eaff8f97af512a0557925187459f835282331120a4 not found: ID does not exist" Feb 18 06:00:09 crc kubenswrapper[4869]: I0218 06:00:09.124151 4869 scope.go:117] "RemoveContainer" containerID="2687aabaa10fbe9916647c606b8f30ed8fa597ecc813f0d84af94442dee35e47" Feb 18 06:00:09 crc kubenswrapper[4869]: E0218 06:00:09.124426 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2687aabaa10fbe9916647c606b8f30ed8fa597ecc813f0d84af94442dee35e47\": container with ID starting with 2687aabaa10fbe9916647c606b8f30ed8fa597ecc813f0d84af94442dee35e47 not found: ID does not exist" containerID="2687aabaa10fbe9916647c606b8f30ed8fa597ecc813f0d84af94442dee35e47" Feb 18 06:00:09 crc 
kubenswrapper[4869]: I0218 06:00:09.124456 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2687aabaa10fbe9916647c606b8f30ed8fa597ecc813f0d84af94442dee35e47"} err="failed to get container status \"2687aabaa10fbe9916647c606b8f30ed8fa597ecc813f0d84af94442dee35e47\": rpc error: code = NotFound desc = could not find container \"2687aabaa10fbe9916647c606b8f30ed8fa597ecc813f0d84af94442dee35e47\": container with ID starting with 2687aabaa10fbe9916647c606b8f30ed8fa597ecc813f0d84af94442dee35e47 not found: ID does not exist" Feb 18 06:00:09 crc kubenswrapper[4869]: I0218 06:00:09.481981 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c" path="/var/lib/kubelet/pods/4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c/volumes" Feb 18 06:00:10 crc kubenswrapper[4869]: I0218 06:00:10.133275 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:00:10 crc kubenswrapper[4869]: I0218 06:00:10.133610 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:00:12 crc kubenswrapper[4869]: E0218 06:00:12.444719 4869 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24b94d41_cd3f_4c37_86a1_0ed957404bab.slice/crio-conmon-f466738af35863185f9dda1e352b84c032bb9e4a575c65957034296db863e720.scope\": RecentStats: unable to find data in memory cache]" Feb 18 
06:00:13 crc kubenswrapper[4869]: I0218 06:00:13.097440 4869 generic.go:334] "Generic (PLEG): container finished" podID="24b94d41-cd3f-4c37-86a1-0ed957404bab" containerID="f466738af35863185f9dda1e352b84c032bb9e4a575c65957034296db863e720" exitCode=0 Feb 18 06:00:13 crc kubenswrapper[4869]: I0218 06:00:13.097538 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h4mbr" event={"ID":"24b94d41-cd3f-4c37-86a1-0ed957404bab","Type":"ContainerDied","Data":"f466738af35863185f9dda1e352b84c032bb9e4a575c65957034296db863e720"} Feb 18 06:00:13 crc kubenswrapper[4869]: I0218 06:00:13.099700 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-r6mvd" event={"ID":"c67c84c4-b4c3-4336-a6b9-3543258cea17","Type":"ContainerStarted","Data":"04c76077f7423e56adcf7e73537a79dcc70b6fafbdb0b81ed794ef0d5425f478"} Feb 18 06:00:13 crc kubenswrapper[4869]: I0218 06:00:13.099885 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-r6mvd" Feb 18 06:00:13 crc kubenswrapper[4869]: I0218 06:00:13.145338 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-r6mvd" podStartSLOduration=2.024235095 podStartE2EDuration="9.145314793s" podCreationTimestamp="2026-02-18 06:00:04 +0000 UTC" firstStartedPulling="2026-02-18 06:00:05.105523174 +0000 UTC m=+702.274611406" lastFinishedPulling="2026-02-18 06:00:12.226602872 +0000 UTC m=+709.395691104" observedRunningTime="2026-02-18 06:00:13.141370736 +0000 UTC m=+710.310458968" watchObservedRunningTime="2026-02-18 06:00:13.145314793 +0000 UTC m=+710.314403025" Feb 18 06:00:14 crc kubenswrapper[4869]: I0218 06:00:14.106085 4869 generic.go:334] "Generic (PLEG): container finished" podID="24b94d41-cd3f-4c37-86a1-0ed957404bab" containerID="a732feddabd164d056aacda9579ae2813e63d6042882d826a29803b7fafb9b0d" exitCode=0 Feb 18 06:00:14 crc 
kubenswrapper[4869]: I0218 06:00:14.106197 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h4mbr" event={"ID":"24b94d41-cd3f-4c37-86a1-0ed957404bab","Type":"ContainerDied","Data":"a732feddabd164d056aacda9579ae2813e63d6042882d826a29803b7fafb9b0d"} Feb 18 06:00:15 crc kubenswrapper[4869]: I0218 06:00:15.112505 4869 generic.go:334] "Generic (PLEG): container finished" podID="24b94d41-cd3f-4c37-86a1-0ed957404bab" containerID="8742d224f074538e193d86982ed63b3d2f6d98b6d3fa995ab3ed4d216456e10f" exitCode=0 Feb 18 06:00:15 crc kubenswrapper[4869]: I0218 06:00:15.112796 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h4mbr" event={"ID":"24b94d41-cd3f-4c37-86a1-0ed957404bab","Type":"ContainerDied","Data":"8742d224f074538e193d86982ed63b3d2f6d98b6d3fa995ab3ed4d216456e10f"} Feb 18 06:00:15 crc kubenswrapper[4869]: I0218 06:00:15.422813 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-sh2wz" Feb 18 06:00:16 crc kubenswrapper[4869]: I0218 06:00:16.139093 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h4mbr" event={"ID":"24b94d41-cd3f-4c37-86a1-0ed957404bab","Type":"ContainerStarted","Data":"563a2427ca75bf5d53424c5c1cd8f19798b56c92329dcb81fe6095002e3fd52b"} Feb 18 06:00:16 crc kubenswrapper[4869]: I0218 06:00:16.139472 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h4mbr" event={"ID":"24b94d41-cd3f-4c37-86a1-0ed957404bab","Type":"ContainerStarted","Data":"92bd51c4eb3dc5d8918b018299ec07085da452549be0b89b73e958520bdc632c"} Feb 18 06:00:16 crc kubenswrapper[4869]: I0218 06:00:16.139484 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h4mbr" event={"ID":"24b94d41-cd3f-4c37-86a1-0ed957404bab","Type":"ContainerStarted","Data":"7f067cf7c20c4e13e061bde770e80a97afde6dc65b6b87f877e30f51feb34ee1"} Feb 18 06:00:16 crc kubenswrapper[4869]: I0218 
06:00:16.139494 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h4mbr" event={"ID":"24b94d41-cd3f-4c37-86a1-0ed957404bab","Type":"ContainerStarted","Data":"eebc362fe5838b1b9fdedd1a899985dbad15441eeb18df0455442e06ba27e778"} Feb 18 06:00:16 crc kubenswrapper[4869]: I0218 06:00:16.139507 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h4mbr" event={"ID":"24b94d41-cd3f-4c37-86a1-0ed957404bab","Type":"ContainerStarted","Data":"790a365338684bb84706597d82d7f269aa74ed24ad982fb6d69654b0cdd72e21"} Feb 18 06:00:16 crc kubenswrapper[4869]: I0218 06:00:16.139519 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h4mbr" event={"ID":"24b94d41-cd3f-4c37-86a1-0ed957404bab","Type":"ContainerStarted","Data":"3b1aeac953e28ef11ac666fcbe03f1a6a839bfea46adeb230f8a07773d3e5d67"} Feb 18 06:00:16 crc kubenswrapper[4869]: I0218 06:00:16.139839 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-h4mbr" Feb 18 06:00:16 crc kubenswrapper[4869]: I0218 06:00:16.169639 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-h4mbr" podStartSLOduration=5.36606051 podStartE2EDuration="12.169621924s" podCreationTimestamp="2026-02-18 06:00:04 +0000 UTC" firstStartedPulling="2026-02-18 06:00:05.39354176 +0000 UTC m=+702.562629992" lastFinishedPulling="2026-02-18 06:00:12.197103174 +0000 UTC m=+709.366191406" observedRunningTime="2026-02-18 06:00:16.16782661 +0000 UTC m=+713.336914842" watchObservedRunningTime="2026-02-18 06:00:16.169621924 +0000 UTC m=+713.338710156" Feb 18 06:00:16 crc kubenswrapper[4869]: I0218 06:00:16.313258 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-v8dwx" Feb 18 06:00:19 crc kubenswrapper[4869]: I0218 06:00:19.174233 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-qb52t"] Feb 18 
06:00:19 crc kubenswrapper[4869]: E0218 06:00:19.174720 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c" containerName="registry-server" Feb 18 06:00:19 crc kubenswrapper[4869]: I0218 06:00:19.174731 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c" containerName="registry-server" Feb 18 06:00:19 crc kubenswrapper[4869]: E0218 06:00:19.174757 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c" containerName="extract-content" Feb 18 06:00:19 crc kubenswrapper[4869]: I0218 06:00:19.174763 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c" containerName="extract-content" Feb 18 06:00:19 crc kubenswrapper[4869]: E0218 06:00:19.174775 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c" containerName="extract-utilities" Feb 18 06:00:19 crc kubenswrapper[4869]: I0218 06:00:19.174781 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c" containerName="extract-utilities" Feb 18 06:00:19 crc kubenswrapper[4869]: I0218 06:00:19.174886 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f1cf7c5-3ebe-4a49-bb97-61dfd265ef0c" containerName="registry-server" Feb 18 06:00:19 crc kubenswrapper[4869]: I0218 06:00:19.175269 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-qb52t" Feb 18 06:00:19 crc kubenswrapper[4869]: I0218 06:00:19.179482 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 18 06:00:19 crc kubenswrapper[4869]: I0218 06:00:19.182654 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 18 06:00:19 crc kubenswrapper[4869]: I0218 06:00:19.193841 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-dfbg5" Feb 18 06:00:19 crc kubenswrapper[4869]: I0218 06:00:19.234077 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qb52t"] Feb 18 06:00:19 crc kubenswrapper[4869]: I0218 06:00:19.251155 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ss7z\" (UniqueName: \"kubernetes.io/projected/e1901f4d-5820-4835-a5b7-c065732b8946-kube-api-access-9ss7z\") pod \"openstack-operator-index-qb52t\" (UID: \"e1901f4d-5820-4835-a5b7-c065732b8946\") " pod="openstack-operators/openstack-operator-index-qb52t" Feb 18 06:00:19 crc kubenswrapper[4869]: I0218 06:00:19.352102 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ss7z\" (UniqueName: \"kubernetes.io/projected/e1901f4d-5820-4835-a5b7-c065732b8946-kube-api-access-9ss7z\") pod \"openstack-operator-index-qb52t\" (UID: \"e1901f4d-5820-4835-a5b7-c065732b8946\") " pod="openstack-operators/openstack-operator-index-qb52t" Feb 18 06:00:19 crc kubenswrapper[4869]: I0218 06:00:19.381943 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ss7z\" (UniqueName: \"kubernetes.io/projected/e1901f4d-5820-4835-a5b7-c065732b8946-kube-api-access-9ss7z\") pod \"openstack-operator-index-qb52t\" (UID: 
\"e1901f4d-5820-4835-a5b7-c065732b8946\") " pod="openstack-operators/openstack-operator-index-qb52t" Feb 18 06:00:19 crc kubenswrapper[4869]: I0218 06:00:19.493636 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qb52t" Feb 18 06:00:19 crc kubenswrapper[4869]: I0218 06:00:19.898265 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qb52t"] Feb 18 06:00:20 crc kubenswrapper[4869]: I0218 06:00:20.164041 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qb52t" event={"ID":"e1901f4d-5820-4835-a5b7-c065732b8946","Type":"ContainerStarted","Data":"6189aff71a3672b853140098af3b6ba329ce913987748dfc96b424d274d223af"} Feb 18 06:00:20 crc kubenswrapper[4869]: I0218 06:00:20.273282 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-h4mbr" Feb 18 06:00:20 crc kubenswrapper[4869]: I0218 06:00:20.310903 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-h4mbr" Feb 18 06:00:22 crc kubenswrapper[4869]: I0218 06:00:22.177716 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qb52t" event={"ID":"e1901f4d-5820-4835-a5b7-c065732b8946","Type":"ContainerStarted","Data":"714e5a8ef61821f50bc8b2184ad8846f88473e3cb3ae0fb0c34c102cc31e1829"} Feb 18 06:00:22 crc kubenswrapper[4869]: I0218 06:00:22.195615 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-qb52t" podStartSLOduration=1.332948418 podStartE2EDuration="3.195592679s" podCreationTimestamp="2026-02-18 06:00:19 +0000 UTC" firstStartedPulling="2026-02-18 06:00:19.915449093 +0000 UTC m=+717.084537325" lastFinishedPulling="2026-02-18 06:00:21.778093354 +0000 UTC m=+718.947181586" observedRunningTime="2026-02-18 06:00:22.191070398 
+0000 UTC m=+719.360158630" watchObservedRunningTime="2026-02-18 06:00:22.195592679 +0000 UTC m=+719.364680921" Feb 18 06:00:22 crc kubenswrapper[4869]: I0218 06:00:22.957448 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-qb52t"] Feb 18 06:00:23 crc kubenswrapper[4869]: I0218 06:00:23.749990 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-crznf"] Feb 18 06:00:23 crc kubenswrapper[4869]: I0218 06:00:23.750960 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-crznf" Feb 18 06:00:23 crc kubenswrapper[4869]: I0218 06:00:23.758822 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-crznf"] Feb 18 06:00:23 crc kubenswrapper[4869]: I0218 06:00:23.817945 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gptff\" (UniqueName: \"kubernetes.io/projected/5f3e331a-a2a6-4bd2-adce-f586154b805c-kube-api-access-gptff\") pod \"openstack-operator-index-crznf\" (UID: \"5f3e331a-a2a6-4bd2-adce-f586154b805c\") " pod="openstack-operators/openstack-operator-index-crznf" Feb 18 06:00:23 crc kubenswrapper[4869]: I0218 06:00:23.919893 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gptff\" (UniqueName: \"kubernetes.io/projected/5f3e331a-a2a6-4bd2-adce-f586154b805c-kube-api-access-gptff\") pod \"openstack-operator-index-crznf\" (UID: \"5f3e331a-a2a6-4bd2-adce-f586154b805c\") " pod="openstack-operators/openstack-operator-index-crznf" Feb 18 06:00:23 crc kubenswrapper[4869]: I0218 06:00:23.941550 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gptff\" (UniqueName: \"kubernetes.io/projected/5f3e331a-a2a6-4bd2-adce-f586154b805c-kube-api-access-gptff\") pod \"openstack-operator-index-crznf\" (UID: 
\"5f3e331a-a2a6-4bd2-adce-f586154b805c\") " pod="openstack-operators/openstack-operator-index-crznf" Feb 18 06:00:24 crc kubenswrapper[4869]: I0218 06:00:24.104957 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-crznf" Feb 18 06:00:24 crc kubenswrapper[4869]: I0218 06:00:24.193090 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-qb52t" podUID="e1901f4d-5820-4835-a5b7-c065732b8946" containerName="registry-server" containerID="cri-o://714e5a8ef61821f50bc8b2184ad8846f88473e3cb3ae0fb0c34c102cc31e1829" gracePeriod=2 Feb 18 06:00:24 crc kubenswrapper[4869]: I0218 06:00:24.526262 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qb52t" Feb 18 06:00:24 crc kubenswrapper[4869]: I0218 06:00:24.529816 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-crznf"] Feb 18 06:00:24 crc kubenswrapper[4869]: W0218 06:00:24.538015 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f3e331a_a2a6_4bd2_adce_f586154b805c.slice/crio-7331199210ec4c8bae3aa5625b4bd198d54c4abe8ee7156ac1ea3baa095ffe2d WatchSource:0}: Error finding container 7331199210ec4c8bae3aa5625b4bd198d54c4abe8ee7156ac1ea3baa095ffe2d: Status 404 returned error can't find the container with id 7331199210ec4c8bae3aa5625b4bd198d54c4abe8ee7156ac1ea3baa095ffe2d Feb 18 06:00:24 crc kubenswrapper[4869]: I0218 06:00:24.629655 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ss7z\" (UniqueName: \"kubernetes.io/projected/e1901f4d-5820-4835-a5b7-c065732b8946-kube-api-access-9ss7z\") pod \"e1901f4d-5820-4835-a5b7-c065732b8946\" (UID: \"e1901f4d-5820-4835-a5b7-c065732b8946\") " Feb 18 06:00:24 crc kubenswrapper[4869]: 
I0218 06:00:24.635209 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1901f4d-5820-4835-a5b7-c065732b8946-kube-api-access-9ss7z" (OuterVolumeSpecName: "kube-api-access-9ss7z") pod "e1901f4d-5820-4835-a5b7-c065732b8946" (UID: "e1901f4d-5820-4835-a5b7-c065732b8946"). InnerVolumeSpecName "kube-api-access-9ss7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:00:24 crc kubenswrapper[4869]: I0218 06:00:24.695001 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-r6mvd" Feb 18 06:00:24 crc kubenswrapper[4869]: I0218 06:00:24.731926 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ss7z\" (UniqueName: \"kubernetes.io/projected/e1901f4d-5820-4835-a5b7-c065732b8946-kube-api-access-9ss7z\") on node \"crc\" DevicePath \"\"" Feb 18 06:00:25 crc kubenswrapper[4869]: I0218 06:00:25.201152 4869 generic.go:334] "Generic (PLEG): container finished" podID="e1901f4d-5820-4835-a5b7-c065732b8946" containerID="714e5a8ef61821f50bc8b2184ad8846f88473e3cb3ae0fb0c34c102cc31e1829" exitCode=0 Feb 18 06:00:25 crc kubenswrapper[4869]: I0218 06:00:25.201217 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qb52t" event={"ID":"e1901f4d-5820-4835-a5b7-c065732b8946","Type":"ContainerDied","Data":"714e5a8ef61821f50bc8b2184ad8846f88473e3cb3ae0fb0c34c102cc31e1829"} Feb 18 06:00:25 crc kubenswrapper[4869]: I0218 06:00:25.201243 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qb52t" event={"ID":"e1901f4d-5820-4835-a5b7-c065732b8946","Type":"ContainerDied","Data":"6189aff71a3672b853140098af3b6ba329ce913987748dfc96b424d274d223af"} Feb 18 06:00:25 crc kubenswrapper[4869]: I0218 06:00:25.201247 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-qb52t" Feb 18 06:00:25 crc kubenswrapper[4869]: I0218 06:00:25.201263 4869 scope.go:117] "RemoveContainer" containerID="714e5a8ef61821f50bc8b2184ad8846f88473e3cb3ae0fb0c34c102cc31e1829" Feb 18 06:00:25 crc kubenswrapper[4869]: I0218 06:00:25.204395 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-crznf" event={"ID":"5f3e331a-a2a6-4bd2-adce-f586154b805c","Type":"ContainerStarted","Data":"f5ce17cc329ebafd1bb458719764f3f91103560885f5dc6695f0d906623f338c"} Feb 18 06:00:25 crc kubenswrapper[4869]: I0218 06:00:25.204438 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-crznf" event={"ID":"5f3e331a-a2a6-4bd2-adce-f586154b805c","Type":"ContainerStarted","Data":"7331199210ec4c8bae3aa5625b4bd198d54c4abe8ee7156ac1ea3baa095ffe2d"} Feb 18 06:00:25 crc kubenswrapper[4869]: I0218 06:00:25.227280 4869 scope.go:117] "RemoveContainer" containerID="714e5a8ef61821f50bc8b2184ad8846f88473e3cb3ae0fb0c34c102cc31e1829" Feb 18 06:00:25 crc kubenswrapper[4869]: E0218 06:00:25.227920 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"714e5a8ef61821f50bc8b2184ad8846f88473e3cb3ae0fb0c34c102cc31e1829\": container with ID starting with 714e5a8ef61821f50bc8b2184ad8846f88473e3cb3ae0fb0c34c102cc31e1829 not found: ID does not exist" containerID="714e5a8ef61821f50bc8b2184ad8846f88473e3cb3ae0fb0c34c102cc31e1829" Feb 18 06:00:25 crc kubenswrapper[4869]: I0218 06:00:25.227958 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"714e5a8ef61821f50bc8b2184ad8846f88473e3cb3ae0fb0c34c102cc31e1829"} err="failed to get container status \"714e5a8ef61821f50bc8b2184ad8846f88473e3cb3ae0fb0c34c102cc31e1829\": rpc error: code = NotFound desc = could not find container 
\"714e5a8ef61821f50bc8b2184ad8846f88473e3cb3ae0fb0c34c102cc31e1829\": container with ID starting with 714e5a8ef61821f50bc8b2184ad8846f88473e3cb3ae0fb0c34c102cc31e1829 not found: ID does not exist" Feb 18 06:00:25 crc kubenswrapper[4869]: I0218 06:00:25.228022 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-crznf" podStartSLOduration=2.18143735 podStartE2EDuration="2.227994059s" podCreationTimestamp="2026-02-18 06:00:23 +0000 UTC" firstStartedPulling="2026-02-18 06:00:24.543939374 +0000 UTC m=+721.713027606" lastFinishedPulling="2026-02-18 06:00:24.590496073 +0000 UTC m=+721.759584315" observedRunningTime="2026-02-18 06:00:25.223653873 +0000 UTC m=+722.392742105" watchObservedRunningTime="2026-02-18 06:00:25.227994059 +0000 UTC m=+722.397082321" Feb 18 06:00:25 crc kubenswrapper[4869]: I0218 06:00:25.249524 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-qb52t"] Feb 18 06:00:25 crc kubenswrapper[4869]: I0218 06:00:25.255487 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-qb52t"] Feb 18 06:00:25 crc kubenswrapper[4869]: I0218 06:00:25.275928 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-h4mbr" Feb 18 06:00:25 crc kubenswrapper[4869]: I0218 06:00:25.488304 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1901f4d-5820-4835-a5b7-c065732b8946" path="/var/lib/kubelet/pods/e1901f4d-5820-4835-a5b7-c065732b8946/volumes" Feb 18 06:00:34 crc kubenswrapper[4869]: I0218 06:00:34.106061 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-crznf" Feb 18 06:00:34 crc kubenswrapper[4869]: I0218 06:00:34.106613 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-crznf" Feb 18 
06:00:34 crc kubenswrapper[4869]: I0218 06:00:34.136101 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-crznf" Feb 18 06:00:34 crc kubenswrapper[4869]: I0218 06:00:34.274663 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-crznf" Feb 18 06:00:37 crc kubenswrapper[4869]: I0218 06:00:37.188257 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4"] Feb 18 06:00:37 crc kubenswrapper[4869]: E0218 06:00:37.188800 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1901f4d-5820-4835-a5b7-c065732b8946" containerName="registry-server" Feb 18 06:00:37 crc kubenswrapper[4869]: I0218 06:00:37.188815 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1901f4d-5820-4835-a5b7-c065732b8946" containerName="registry-server" Feb 18 06:00:37 crc kubenswrapper[4869]: I0218 06:00:37.188957 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1901f4d-5820-4835-a5b7-c065732b8946" containerName="registry-server" Feb 18 06:00:37 crc kubenswrapper[4869]: I0218 06:00:37.189778 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4" Feb 18 06:00:37 crc kubenswrapper[4869]: I0218 06:00:37.192205 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-2kmhf" Feb 18 06:00:37 crc kubenswrapper[4869]: I0218 06:00:37.206207 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4"] Feb 18 06:00:37 crc kubenswrapper[4869]: I0218 06:00:37.316954 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a65bf9b8-ddf3-4a68-8dfd-fa484987b27b-util\") pod \"c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4\" (UID: \"a65bf9b8-ddf3-4a68-8dfd-fa484987b27b\") " pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4" Feb 18 06:00:37 crc kubenswrapper[4869]: I0218 06:00:37.317162 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a65bf9b8-ddf3-4a68-8dfd-fa484987b27b-bundle\") pod \"c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4\" (UID: \"a65bf9b8-ddf3-4a68-8dfd-fa484987b27b\") " pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4" Feb 18 06:00:37 crc kubenswrapper[4869]: I0218 06:00:37.317213 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fzpm\" (UniqueName: \"kubernetes.io/projected/a65bf9b8-ddf3-4a68-8dfd-fa484987b27b-kube-api-access-9fzpm\") pod \"c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4\" (UID: \"a65bf9b8-ddf3-4a68-8dfd-fa484987b27b\") " pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4" Feb 18 06:00:37 crc kubenswrapper[4869]: I0218 
06:00:37.419658 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a65bf9b8-ddf3-4a68-8dfd-fa484987b27b-bundle\") pod \"c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4\" (UID: \"a65bf9b8-ddf3-4a68-8dfd-fa484987b27b\") " pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4" Feb 18 06:00:37 crc kubenswrapper[4869]: I0218 06:00:37.419784 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fzpm\" (UniqueName: \"kubernetes.io/projected/a65bf9b8-ddf3-4a68-8dfd-fa484987b27b-kube-api-access-9fzpm\") pod \"c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4\" (UID: \"a65bf9b8-ddf3-4a68-8dfd-fa484987b27b\") " pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4" Feb 18 06:00:37 crc kubenswrapper[4869]: I0218 06:00:37.419923 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a65bf9b8-ddf3-4a68-8dfd-fa484987b27b-util\") pod \"c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4\" (UID: \"a65bf9b8-ddf3-4a68-8dfd-fa484987b27b\") " pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4" Feb 18 06:00:37 crc kubenswrapper[4869]: I0218 06:00:37.420600 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a65bf9b8-ddf3-4a68-8dfd-fa484987b27b-bundle\") pod \"c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4\" (UID: \"a65bf9b8-ddf3-4a68-8dfd-fa484987b27b\") " pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4" Feb 18 06:00:37 crc kubenswrapper[4869]: I0218 06:00:37.420732 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/a65bf9b8-ddf3-4a68-8dfd-fa484987b27b-util\") pod \"c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4\" (UID: \"a65bf9b8-ddf3-4a68-8dfd-fa484987b27b\") " pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4" Feb 18 06:00:37 crc kubenswrapper[4869]: I0218 06:00:37.445406 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fzpm\" (UniqueName: \"kubernetes.io/projected/a65bf9b8-ddf3-4a68-8dfd-fa484987b27b-kube-api-access-9fzpm\") pod \"c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4\" (UID: \"a65bf9b8-ddf3-4a68-8dfd-fa484987b27b\") " pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4" Feb 18 06:00:37 crc kubenswrapper[4869]: I0218 06:00:37.546012 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4" Feb 18 06:00:38 crc kubenswrapper[4869]: I0218 06:00:38.003910 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4"] Feb 18 06:00:38 crc kubenswrapper[4869]: W0218 06:00:38.013759 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda65bf9b8_ddf3_4a68_8dfd_fa484987b27b.slice/crio-3e4d1161c2a9ba69630a6125ba524b88e0c1a9fd80591fb61ecd3fc609dbdacc WatchSource:0}: Error finding container 3e4d1161c2a9ba69630a6125ba524b88e0c1a9fd80591fb61ecd3fc609dbdacc: Status 404 returned error can't find the container with id 3e4d1161c2a9ba69630a6125ba524b88e0c1a9fd80591fb61ecd3fc609dbdacc Feb 18 06:00:38 crc kubenswrapper[4869]: I0218 06:00:38.285290 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4" 
event={"ID":"a65bf9b8-ddf3-4a68-8dfd-fa484987b27b","Type":"ContainerStarted","Data":"ab7939c39740fc0420e296ae667f9cf63b9832c84c14bf4a9f5f934f06356b55"} Feb 18 06:00:38 crc kubenswrapper[4869]: I0218 06:00:38.285346 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4" event={"ID":"a65bf9b8-ddf3-4a68-8dfd-fa484987b27b","Type":"ContainerStarted","Data":"3e4d1161c2a9ba69630a6125ba524b88e0c1a9fd80591fb61ecd3fc609dbdacc"} Feb 18 06:00:39 crc kubenswrapper[4869]: I0218 06:00:39.297705 4869 generic.go:334] "Generic (PLEG): container finished" podID="a65bf9b8-ddf3-4a68-8dfd-fa484987b27b" containerID="ab7939c39740fc0420e296ae667f9cf63b9832c84c14bf4a9f5f934f06356b55" exitCode=0 Feb 18 06:00:39 crc kubenswrapper[4869]: I0218 06:00:39.297902 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4" event={"ID":"a65bf9b8-ddf3-4a68-8dfd-fa484987b27b","Type":"ContainerDied","Data":"ab7939c39740fc0420e296ae667f9cf63b9832c84c14bf4a9f5f934f06356b55"} Feb 18 06:00:40 crc kubenswrapper[4869]: I0218 06:00:40.133235 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:00:40 crc kubenswrapper[4869]: I0218 06:00:40.133314 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:00:40 crc kubenswrapper[4869]: I0218 06:00:40.308708 4869 generic.go:334] "Generic (PLEG): container finished" 
podID="a65bf9b8-ddf3-4a68-8dfd-fa484987b27b" containerID="3cf499d01156c9552df5db1f4d47dd89ef6af2eabbff86c4d3da0b79ad477243" exitCode=0 Feb 18 06:00:40 crc kubenswrapper[4869]: I0218 06:00:40.308764 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4" event={"ID":"a65bf9b8-ddf3-4a68-8dfd-fa484987b27b","Type":"ContainerDied","Data":"3cf499d01156c9552df5db1f4d47dd89ef6af2eabbff86c4d3da0b79ad477243"} Feb 18 06:00:41 crc kubenswrapper[4869]: I0218 06:00:41.320387 4869 generic.go:334] "Generic (PLEG): container finished" podID="a65bf9b8-ddf3-4a68-8dfd-fa484987b27b" containerID="ddb8ab13e862754c948c47f07b4d0b59b4b1ff208ad26f54dfdaceb1fa2650a5" exitCode=0 Feb 18 06:00:41 crc kubenswrapper[4869]: I0218 06:00:41.320443 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4" event={"ID":"a65bf9b8-ddf3-4a68-8dfd-fa484987b27b","Type":"ContainerDied","Data":"ddb8ab13e862754c948c47f07b4d0b59b4b1ff208ad26f54dfdaceb1fa2650a5"} Feb 18 06:00:42 crc kubenswrapper[4869]: I0218 06:00:42.569807 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4" Feb 18 06:00:42 crc kubenswrapper[4869]: I0218 06:00:42.705139 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fzpm\" (UniqueName: \"kubernetes.io/projected/a65bf9b8-ddf3-4a68-8dfd-fa484987b27b-kube-api-access-9fzpm\") pod \"a65bf9b8-ddf3-4a68-8dfd-fa484987b27b\" (UID: \"a65bf9b8-ddf3-4a68-8dfd-fa484987b27b\") " Feb 18 06:00:42 crc kubenswrapper[4869]: I0218 06:00:42.705196 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a65bf9b8-ddf3-4a68-8dfd-fa484987b27b-bundle\") pod \"a65bf9b8-ddf3-4a68-8dfd-fa484987b27b\" (UID: \"a65bf9b8-ddf3-4a68-8dfd-fa484987b27b\") " Feb 18 06:00:42 crc kubenswrapper[4869]: I0218 06:00:42.705274 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a65bf9b8-ddf3-4a68-8dfd-fa484987b27b-util\") pod \"a65bf9b8-ddf3-4a68-8dfd-fa484987b27b\" (UID: \"a65bf9b8-ddf3-4a68-8dfd-fa484987b27b\") " Feb 18 06:00:42 crc kubenswrapper[4869]: I0218 06:00:42.712193 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a65bf9b8-ddf3-4a68-8dfd-fa484987b27b-bundle" (OuterVolumeSpecName: "bundle") pod "a65bf9b8-ddf3-4a68-8dfd-fa484987b27b" (UID: "a65bf9b8-ddf3-4a68-8dfd-fa484987b27b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:00:42 crc kubenswrapper[4869]: I0218 06:00:42.726966 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a65bf9b8-ddf3-4a68-8dfd-fa484987b27b-kube-api-access-9fzpm" (OuterVolumeSpecName: "kube-api-access-9fzpm") pod "a65bf9b8-ddf3-4a68-8dfd-fa484987b27b" (UID: "a65bf9b8-ddf3-4a68-8dfd-fa484987b27b"). InnerVolumeSpecName "kube-api-access-9fzpm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:00:42 crc kubenswrapper[4869]: I0218 06:00:42.743818 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a65bf9b8-ddf3-4a68-8dfd-fa484987b27b-util" (OuterVolumeSpecName: "util") pod "a65bf9b8-ddf3-4a68-8dfd-fa484987b27b" (UID: "a65bf9b8-ddf3-4a68-8dfd-fa484987b27b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:00:42 crc kubenswrapper[4869]: I0218 06:00:42.806107 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fzpm\" (UniqueName: \"kubernetes.io/projected/a65bf9b8-ddf3-4a68-8dfd-fa484987b27b-kube-api-access-9fzpm\") on node \"crc\" DevicePath \"\"" Feb 18 06:00:42 crc kubenswrapper[4869]: I0218 06:00:42.806136 4869 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a65bf9b8-ddf3-4a68-8dfd-fa484987b27b-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:00:42 crc kubenswrapper[4869]: I0218 06:00:42.806145 4869 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a65bf9b8-ddf3-4a68-8dfd-fa484987b27b-util\") on node \"crc\" DevicePath \"\"" Feb 18 06:00:43 crc kubenswrapper[4869]: I0218 06:00:43.338609 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4" event={"ID":"a65bf9b8-ddf3-4a68-8dfd-fa484987b27b","Type":"ContainerDied","Data":"3e4d1161c2a9ba69630a6125ba524b88e0c1a9fd80591fb61ecd3fc609dbdacc"} Feb 18 06:00:43 crc kubenswrapper[4869]: I0218 06:00:43.338658 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e4d1161c2a9ba69630a6125ba524b88e0c1a9fd80591fb61ecd3fc609dbdacc" Feb 18 06:00:43 crc kubenswrapper[4869]: I0218 06:00:43.338835 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4" Feb 18 06:00:49 crc kubenswrapper[4869]: I0218 06:00:49.235169 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-766dc4fc6-rtp2x"] Feb 18 06:00:49 crc kubenswrapper[4869]: E0218 06:00:49.236024 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a65bf9b8-ddf3-4a68-8dfd-fa484987b27b" containerName="util" Feb 18 06:00:49 crc kubenswrapper[4869]: I0218 06:00:49.236038 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a65bf9b8-ddf3-4a68-8dfd-fa484987b27b" containerName="util" Feb 18 06:00:49 crc kubenswrapper[4869]: E0218 06:00:49.236058 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a65bf9b8-ddf3-4a68-8dfd-fa484987b27b" containerName="extract" Feb 18 06:00:49 crc kubenswrapper[4869]: I0218 06:00:49.236064 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a65bf9b8-ddf3-4a68-8dfd-fa484987b27b" containerName="extract" Feb 18 06:00:49 crc kubenswrapper[4869]: E0218 06:00:49.236072 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a65bf9b8-ddf3-4a68-8dfd-fa484987b27b" containerName="pull" Feb 18 06:00:49 crc kubenswrapper[4869]: I0218 06:00:49.236078 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a65bf9b8-ddf3-4a68-8dfd-fa484987b27b" containerName="pull" Feb 18 06:00:49 crc kubenswrapper[4869]: I0218 06:00:49.236183 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="a65bf9b8-ddf3-4a68-8dfd-fa484987b27b" containerName="extract" Feb 18 06:00:49 crc kubenswrapper[4869]: I0218 06:00:49.236985 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-766dc4fc6-rtp2x" Feb 18 06:00:49 crc kubenswrapper[4869]: I0218 06:00:49.240343 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-557jx" Feb 18 06:00:49 crc kubenswrapper[4869]: I0218 06:00:49.257411 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-766dc4fc6-rtp2x"] Feb 18 06:00:49 crc kubenswrapper[4869]: I0218 06:00:49.289363 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9xtg\" (UniqueName: \"kubernetes.io/projected/a795b61b-e61c-46e5-a72e-e64ca8421756-kube-api-access-w9xtg\") pod \"openstack-operator-controller-init-766dc4fc6-rtp2x\" (UID: \"a795b61b-e61c-46e5-a72e-e64ca8421756\") " pod="openstack-operators/openstack-operator-controller-init-766dc4fc6-rtp2x" Feb 18 06:00:49 crc kubenswrapper[4869]: I0218 06:00:49.391052 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9xtg\" (UniqueName: \"kubernetes.io/projected/a795b61b-e61c-46e5-a72e-e64ca8421756-kube-api-access-w9xtg\") pod \"openstack-operator-controller-init-766dc4fc6-rtp2x\" (UID: \"a795b61b-e61c-46e5-a72e-e64ca8421756\") " pod="openstack-operators/openstack-operator-controller-init-766dc4fc6-rtp2x" Feb 18 06:00:49 crc kubenswrapper[4869]: I0218 06:00:49.412563 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9xtg\" (UniqueName: \"kubernetes.io/projected/a795b61b-e61c-46e5-a72e-e64ca8421756-kube-api-access-w9xtg\") pod \"openstack-operator-controller-init-766dc4fc6-rtp2x\" (UID: \"a795b61b-e61c-46e5-a72e-e64ca8421756\") " pod="openstack-operators/openstack-operator-controller-init-766dc4fc6-rtp2x" Feb 18 06:00:49 crc kubenswrapper[4869]: I0218 06:00:49.560027 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-766dc4fc6-rtp2x" Feb 18 06:00:49 crc kubenswrapper[4869]: I0218 06:00:49.839772 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-766dc4fc6-rtp2x"] Feb 18 06:00:50 crc kubenswrapper[4869]: I0218 06:00:50.388516 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-766dc4fc6-rtp2x" event={"ID":"a795b61b-e61c-46e5-a72e-e64ca8421756","Type":"ContainerStarted","Data":"364538e210af60802bc91ccefe5412c7524a11bdf006684b2629859c12c07d33"} Feb 18 06:00:54 crc kubenswrapper[4869]: I0218 06:00:54.414989 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-766dc4fc6-rtp2x" event={"ID":"a795b61b-e61c-46e5-a72e-e64ca8421756","Type":"ContainerStarted","Data":"27e5336b0ce8b7d4ec86261f0b7e987a3a193b1581d93c76c058945e15fce9a8"} Feb 18 06:00:54 crc kubenswrapper[4869]: I0218 06:00:54.415356 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-766dc4fc6-rtp2x" Feb 18 06:00:54 crc kubenswrapper[4869]: I0218 06:00:54.449150 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-766dc4fc6-rtp2x" podStartSLOduration=1.836704691 podStartE2EDuration="5.449132853s" podCreationTimestamp="2026-02-18 06:00:49 +0000 UTC" firstStartedPulling="2026-02-18 06:00:49.846851423 +0000 UTC m=+747.015939655" lastFinishedPulling="2026-02-18 06:00:53.459279585 +0000 UTC m=+750.628367817" observedRunningTime="2026-02-18 06:00:54.448836626 +0000 UTC m=+751.617924868" watchObservedRunningTime="2026-02-18 06:00:54.449132853 +0000 UTC m=+751.618221085" Feb 18 06:00:59 crc kubenswrapper[4869]: I0218 06:00:59.562978 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-controller-init-766dc4fc6-rtp2x" Feb 18 06:01:10 crc kubenswrapper[4869]: I0218 06:01:10.133553 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:01:10 crc kubenswrapper[4869]: I0218 06:01:10.134237 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:01:10 crc kubenswrapper[4869]: I0218 06:01:10.134295 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" Feb 18 06:01:10 crc kubenswrapper[4869]: I0218 06:01:10.134972 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7562193726eefe80121fb4b3382e37ca22d430274c9cd86ef820a8666e2ec8f9"} pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 06:01:10 crc kubenswrapper[4869]: I0218 06:01:10.135039 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" containerID="cri-o://7562193726eefe80121fb4b3382e37ca22d430274c9cd86ef820a8666e2ec8f9" gracePeriod=600 Feb 18 06:01:10 crc kubenswrapper[4869]: I0218 06:01:10.516540 4869 generic.go:334] "Generic (PLEG): container finished" 
podID="781aec66-5fc7-4161-a704-cc78830d525d" containerID="7562193726eefe80121fb4b3382e37ca22d430274c9cd86ef820a8666e2ec8f9" exitCode=0 Feb 18 06:01:10 crc kubenswrapper[4869]: I0218 06:01:10.516611 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerDied","Data":"7562193726eefe80121fb4b3382e37ca22d430274c9cd86ef820a8666e2ec8f9"} Feb 18 06:01:10 crc kubenswrapper[4869]: I0218 06:01:10.516885 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerStarted","Data":"e9e47b16933a8107451e04ae8f93c9313979bca3d095548d99cb42d4297f33ca"} Feb 18 06:01:10 crc kubenswrapper[4869]: I0218 06:01:10.516908 4869 scope.go:117] "RemoveContainer" containerID="9af2aed1a265c6c4127223a14b7d2dfbeb17faca5aaf7f8066c5e58e1ab7d105" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.131185 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-q5gnw"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.132707 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-q5gnw" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.135184 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-clzgk" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.137023 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-knzcc"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.138039 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-knzcc" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.146758 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-ttl5h" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.151458 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-8gxgm"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.152541 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-8gxgm" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.155256 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-qbfcg" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.163691 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-knzcc"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.169522 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-q5gnw"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.183909 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-gfxvp"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.184893 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-gfxvp" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.186703 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-jcprw" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.188326 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tngq\" (UniqueName: \"kubernetes.io/projected/c1a14efa-4a9b-49b6-a882-c0d080269850-kube-api-access-9tngq\") pod \"barbican-operator-controller-manager-868647ff47-q5gnw\" (UID: \"c1a14efa-4a9b-49b6-a882-c0d080269850\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-q5gnw" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.188376 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8gcz\" (UniqueName: \"kubernetes.io/projected/e5345d15-54e7-4c42-92d2-e3f4d63e9533-kube-api-access-n8gcz\") pod \"cinder-operator-controller-manager-5d946d989d-knzcc\" (UID: \"e5345d15-54e7-4c42-92d2-e3f4d63e9533\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-knzcc" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.191969 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-s9l2b"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.192760 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-s9l2b" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.196880 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-r9ccx" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.214322 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-fnrx8"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.215406 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-fnrx8" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.217097 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.229175 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-vn2vp" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.235491 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-n5zdw"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.236332 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-n5zdw" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.240631 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-mh7pr" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.254017 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-8gxgm"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.270863 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-gfxvp"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.277647 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-fnrx8"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.282794 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-55txz"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.283535 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-55txz" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.289352 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg2r9\" (UniqueName: \"kubernetes.io/projected/65820ad0-cf24-499c-b418-8980edb8788a-kube-api-access-vg2r9\") pod \"heat-operator-controller-manager-69f49c598c-s9l2b\" (UID: \"65820ad0-cf24-499c-b418-8980edb8788a\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-s9l2b" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.289412 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2q68\" (UniqueName: \"kubernetes.io/projected/ea6c026b-8825-42ec-8b66-9c2842957c10-kube-api-access-k2q68\") pod \"infra-operator-controller-manager-79d975b745-fnrx8\" (UID: \"ea6c026b-8825-42ec-8b66-9c2842957c10\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-fnrx8" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.289431 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjpsd\" (UniqueName: \"kubernetes.io/projected/02aea0c3-b59c-41dd-9c48-514fd4bfa94c-kube-api-access-hjpsd\") pod \"glance-operator-controller-manager-77987464f4-gfxvp\" (UID: \"02aea0c3-b59c-41dd-9c48-514fd4bfa94c\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-gfxvp" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.289465 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9hss\" (UniqueName: \"kubernetes.io/projected/4a638516-be5b-4a24-9d1a-cc5dbcaac3ed-kube-api-access-b9hss\") pod \"horizon-operator-controller-manager-5b9b8895d5-n5zdw\" (UID: \"4a638516-be5b-4a24-9d1a-cc5dbcaac3ed\") " 
pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-n5zdw" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.289510 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djd66\" (UniqueName: \"kubernetes.io/projected/77f20e81-cc4d-44ab-9f77-40080cc392ec-kube-api-access-djd66\") pod \"designate-operator-controller-manager-6d8bf5c495-8gxgm\" (UID: \"77f20e81-cc4d-44ab-9f77-40080cc392ec\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-8gxgm" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.289535 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tngq\" (UniqueName: \"kubernetes.io/projected/c1a14efa-4a9b-49b6-a882-c0d080269850-kube-api-access-9tngq\") pod \"barbican-operator-controller-manager-868647ff47-q5gnw\" (UID: \"c1a14efa-4a9b-49b6-a882-c0d080269850\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-q5gnw" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.289562 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8gcz\" (UniqueName: \"kubernetes.io/projected/e5345d15-54e7-4c42-92d2-e3f4d63e9533-kube-api-access-n8gcz\") pod \"cinder-operator-controller-manager-5d946d989d-knzcc\" (UID: \"e5345d15-54e7-4c42-92d2-e3f4d63e9533\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-knzcc" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.289580 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea6c026b-8825-42ec-8b66-9c2842957c10-cert\") pod \"infra-operator-controller-manager-79d975b745-fnrx8\" (UID: \"ea6c026b-8825-42ec-8b66-9c2842957c10\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-fnrx8" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.295828 4869 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-n5zdw"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.296325 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-mqqpq" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.312858 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-btm9t"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.314145 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-btm9t" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.327091 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-55txz"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.332879 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tngq\" (UniqueName: \"kubernetes.io/projected/c1a14efa-4a9b-49b6-a882-c0d080269850-kube-api-access-9tngq\") pod \"barbican-operator-controller-manager-868647ff47-q5gnw\" (UID: \"c1a14efa-4a9b-49b6-a882-c0d080269850\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-q5gnw" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.334175 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-vv9xc" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.372549 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8gcz\" (UniqueName: \"kubernetes.io/projected/e5345d15-54e7-4c42-92d2-e3f4d63e9533-kube-api-access-n8gcz\") pod \"cinder-operator-controller-manager-5d946d989d-knzcc\" (UID: \"e5345d15-54e7-4c42-92d2-e3f4d63e9533\") " 
pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-knzcc" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.373350 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-5t2ll"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.380234 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-5t2ll" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.396527 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b7kc\" (UniqueName: \"kubernetes.io/projected/3519f676-e828-4ec9-8995-ecf778e36d4f-kube-api-access-6b7kc\") pod \"keystone-operator-controller-manager-b4d948c87-btm9t\" (UID: \"3519f676-e828-4ec9-8995-ecf778e36d4f\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-btm9t" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.396572 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2q68\" (UniqueName: \"kubernetes.io/projected/ea6c026b-8825-42ec-8b66-9c2842957c10-kube-api-access-k2q68\") pod \"infra-operator-controller-manager-79d975b745-fnrx8\" (UID: \"ea6c026b-8825-42ec-8b66-9c2842957c10\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-fnrx8" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.396592 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjpsd\" (UniqueName: \"kubernetes.io/projected/02aea0c3-b59c-41dd-9c48-514fd4bfa94c-kube-api-access-hjpsd\") pod \"glance-operator-controller-manager-77987464f4-gfxvp\" (UID: \"02aea0c3-b59c-41dd-9c48-514fd4bfa94c\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-gfxvp" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.396630 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67s8v\" (UniqueName: \"kubernetes.io/projected/da28bc19-c4b2-4d9a-8357-6ce9680567ce-kube-api-access-67s8v\") pod \"ironic-operator-controller-manager-554564d7fc-55txz\" (UID: \"da28bc19-c4b2-4d9a-8357-6ce9680567ce\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-55txz" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.396652 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9hss\" (UniqueName: \"kubernetes.io/projected/4a638516-be5b-4a24-9d1a-cc5dbcaac3ed-kube-api-access-b9hss\") pod \"horizon-operator-controller-manager-5b9b8895d5-n5zdw\" (UID: \"4a638516-be5b-4a24-9d1a-cc5dbcaac3ed\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-n5zdw" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.396680 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djd66\" (UniqueName: \"kubernetes.io/projected/77f20e81-cc4d-44ab-9f77-40080cc392ec-kube-api-access-djd66\") pod \"designate-operator-controller-manager-6d8bf5c495-8gxgm\" (UID: \"77f20e81-cc4d-44ab-9f77-40080cc392ec\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-8gxgm" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.396706 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea6c026b-8825-42ec-8b66-9c2842957c10-cert\") pod \"infra-operator-controller-manager-79d975b745-fnrx8\" (UID: \"ea6c026b-8825-42ec-8b66-9c2842957c10\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-fnrx8" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.396758 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg2r9\" (UniqueName: 
\"kubernetes.io/projected/65820ad0-cf24-499c-b418-8980edb8788a-kube-api-access-vg2r9\") pod \"heat-operator-controller-manager-69f49c598c-s9l2b\" (UID: \"65820ad0-cf24-499c-b418-8980edb8788a\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-s9l2b" Feb 18 06:01:19 crc kubenswrapper[4869]: E0218 06:01:19.397156 4869 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 18 06:01:19 crc kubenswrapper[4869]: E0218 06:01:19.397200 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6c026b-8825-42ec-8b66-9c2842957c10-cert podName:ea6c026b-8825-42ec-8b66-9c2842957c10 nodeName:}" failed. No retries permitted until 2026-02-18 06:01:19.897186157 +0000 UTC m=+777.066274389 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea6c026b-8825-42ec-8b66-9c2842957c10-cert") pod "infra-operator-controller-manager-79d975b745-fnrx8" (UID: "ea6c026b-8825-42ec-8b66-9c2842957c10") : secret "infra-operator-webhook-server-cert" not found Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.397844 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-sbq48"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.398641 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-sbq48" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.406242 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-464b5" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.406686 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-7rs54" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.436536 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-btm9t"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.445851 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-sbq48"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.446534 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2q68\" (UniqueName: \"kubernetes.io/projected/ea6c026b-8825-42ec-8b66-9c2842957c10-kube-api-access-k2q68\") pod \"infra-operator-controller-manager-79d975b745-fnrx8\" (UID: \"ea6c026b-8825-42ec-8b66-9c2842957c10\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-fnrx8" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.447383 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg2r9\" (UniqueName: \"kubernetes.io/projected/65820ad0-cf24-499c-b418-8980edb8788a-kube-api-access-vg2r9\") pod \"heat-operator-controller-manager-69f49c598c-s9l2b\" (UID: \"65820ad0-cf24-499c-b418-8980edb8788a\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-s9l2b" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.450306 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9hss\" (UniqueName: 
\"kubernetes.io/projected/4a638516-be5b-4a24-9d1a-cc5dbcaac3ed-kube-api-access-b9hss\") pod \"horizon-operator-controller-manager-5b9b8895d5-n5zdw\" (UID: \"4a638516-be5b-4a24-9d1a-cc5dbcaac3ed\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-n5zdw" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.451864 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-5t2ll"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.457524 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djd66\" (UniqueName: \"kubernetes.io/projected/77f20e81-cc4d-44ab-9f77-40080cc392ec-kube-api-access-djd66\") pod \"designate-operator-controller-manager-6d8bf5c495-8gxgm\" (UID: \"77f20e81-cc4d-44ab-9f77-40080cc392ec\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-8gxgm" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.457939 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-q5gnw" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.463820 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjpsd\" (UniqueName: \"kubernetes.io/projected/02aea0c3-b59c-41dd-9c48-514fd4bfa94c-kube-api-access-hjpsd\") pod \"glance-operator-controller-manager-77987464f4-gfxvp\" (UID: \"02aea0c3-b59c-41dd-9c48-514fd4bfa94c\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-gfxvp" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.480494 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-knzcc" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.486845 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-8gxgm" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.492988 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-s9l2b"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.497446 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b7kc\" (UniqueName: \"kubernetes.io/projected/3519f676-e828-4ec9-8995-ecf778e36d4f-kube-api-access-6b7kc\") pod \"keystone-operator-controller-manager-b4d948c87-btm9t\" (UID: \"3519f676-e828-4ec9-8995-ecf778e36d4f\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-btm9t" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.497498 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4qfh\" (UniqueName: \"kubernetes.io/projected/835a35ac-1347-46f7-ae71-aa38e8aea7cf-kube-api-access-k4qfh\") pod \"mariadb-operator-controller-manager-6994f66f48-sbq48\" (UID: \"835a35ac-1347-46f7-ae71-aa38e8aea7cf\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-sbq48" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.497538 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67s8v\" (UniqueName: \"kubernetes.io/projected/da28bc19-c4b2-4d9a-8357-6ce9680567ce-kube-api-access-67s8v\") pod \"ironic-operator-controller-manager-554564d7fc-55txz\" (UID: \"da28bc19-c4b2-4d9a-8357-6ce9680567ce\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-55txz" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.497612 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x24ns\" (UniqueName: \"kubernetes.io/projected/3627c187-4d3b-49cb-9367-5758e676b0af-kube-api-access-x24ns\") pod 
\"manila-operator-controller-manager-54f6768c69-5t2ll\" (UID: \"3627c187-4d3b-49cb-9367-5758e676b0af\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-5t2ll" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.512558 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-gfxvp" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.512868 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-drwmx"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.514046 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-drwmx" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.518616 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-qrt9h" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.519243 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67s8v\" (UniqueName: \"kubernetes.io/projected/da28bc19-c4b2-4d9a-8357-6ce9680567ce-kube-api-access-67s8v\") pod \"ironic-operator-controller-manager-554564d7fc-55txz\" (UID: \"da28bc19-c4b2-4d9a-8357-6ce9680567ce\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-55txz" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.520508 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-vfs4v"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.523478 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b7kc\" (UniqueName: \"kubernetes.io/projected/3519f676-e828-4ec9-8995-ecf778e36d4f-kube-api-access-6b7kc\") pod \"keystone-operator-controller-manager-b4d948c87-btm9t\" (UID: 
\"3519f676-e828-4ec9-8995-ecf778e36d4f\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-btm9t" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.529515 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-2ctsx"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.530082 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-vfs4v" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.535640 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-69cfm" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.536095 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-2ctsx" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.537413 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-s9l2b" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.538669 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-drwmx"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.539482 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-pd7kj" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.548912 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-vfs4v"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.558827 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-2ctsx"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.566374 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.567427 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.570051 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-vd96h" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.570225 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.571109 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-gbx94"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.574506 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-n5zdw" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.576048 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-47sw8"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.582295 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-gbx94" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.583825 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-47sw8" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.590517 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-xzs65" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.590759 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-48vcm" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.598607 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x24ns\" (UniqueName: \"kubernetes.io/projected/3627c187-4d3b-49cb-9367-5758e676b0af-kube-api-access-x24ns\") pod \"manila-operator-controller-manager-54f6768c69-5t2ll\" (UID: \"3627c187-4d3b-49cb-9367-5758e676b0af\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-5t2ll" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.598657 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7shh\" (UniqueName: \"kubernetes.io/projected/78799685-a70e-4b5d-ae0f-fbd4ac1f48fd-kube-api-access-x7shh\") pod \"octavia-operator-controller-manager-69f8888797-2ctsx\" (UID: \"78799685-a70e-4b5d-ae0f-fbd4ac1f48fd\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-2ctsx" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.598728 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4qfh\" (UniqueName: \"kubernetes.io/projected/835a35ac-1347-46f7-ae71-aa38e8aea7cf-kube-api-access-k4qfh\") pod \"mariadb-operator-controller-manager-6994f66f48-sbq48\" (UID: \"835a35ac-1347-46f7-ae71-aa38e8aea7cf\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-sbq48" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.598777 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9f64\" (UniqueName: \"kubernetes.io/projected/269fa527-4152-4014-b070-7e651d5f7b2f-kube-api-access-c9f64\") pod \"nova-operator-controller-manager-567668f5cf-vfs4v\" (UID: \"269fa527-4152-4014-b070-7e651d5f7b2f\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-vfs4v" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.598809 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9scxr\" (UniqueName: \"kubernetes.io/projected/cbb0292f-e15f-4a01-bd91-1c155779be07-kube-api-access-9scxr\") pod \"neutron-operator-controller-manager-64ddbf8bb-drwmx\" (UID: \"cbb0292f-e15f-4a01-bd91-1c155779be07\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-drwmx" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.606414 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-55txz" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.616058 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-gbx94"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.626347 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4qfh\" (UniqueName: \"kubernetes.io/projected/835a35ac-1347-46f7-ae71-aa38e8aea7cf-kube-api-access-k4qfh\") pod \"mariadb-operator-controller-manager-6994f66f48-sbq48\" (UID: \"835a35ac-1347-46f7-ae71-aa38e8aea7cf\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-sbq48" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.634813 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x24ns\" (UniqueName: 
\"kubernetes.io/projected/3627c187-4d3b-49cb-9367-5758e676b0af-kube-api-access-x24ns\") pod \"manila-operator-controller-manager-54f6768c69-5t2ll\" (UID: \"3627c187-4d3b-49cb-9367-5758e676b0af\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-5t2ll" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.635183 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.638548 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-86mkx"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.641299 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-86mkx" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.644858 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-cfcgp" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.650317 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-86mkx"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.651185 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-47sw8"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.660586 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xkrff"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.663859 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xkrff" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.665836 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-wstmc" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.680753 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xkrff"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.698139 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-btm9t" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.699634 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxj5h\" (UniqueName: \"kubernetes.io/projected/e2de3218-8c57-43ab-b45e-e69a92456549-kube-api-access-lxj5h\") pod \"placement-operator-controller-manager-8497b45c89-47sw8\" (UID: \"e2de3218-8c57-43ab-b45e-e69a92456549\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-47sw8" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.699671 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7shh\" (UniqueName: \"kubernetes.io/projected/78799685-a70e-4b5d-ae0f-fbd4ac1f48fd-kube-api-access-x7shh\") pod \"octavia-operator-controller-manager-69f8888797-2ctsx\" (UID: \"78799685-a70e-4b5d-ae0f-fbd4ac1f48fd\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-2ctsx" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.699701 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6mnf\" (UniqueName: \"kubernetes.io/projected/e90352c9-520d-40dc-b9f6-3919a8bd67fb-kube-api-access-m6mnf\") pod 
\"ovn-operator-controller-manager-d44cf6b75-gbx94\" (UID: \"e90352c9-520d-40dc-b9f6-3919a8bd67fb\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-gbx94" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.699722 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xncdt\" (UniqueName: \"kubernetes.io/projected/f954d2f9-baf9-4d98-bee1-05598035e3a1-kube-api-access-xncdt\") pod \"swift-operator-controller-manager-68f46476f-86mkx\" (UID: \"f954d2f9-baf9-4d98-bee1-05598035e3a1\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-86mkx" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.699756 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fdd1d16-0b06-4553-b43a-943fb22f8961-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv\" (UID: \"8fdd1d16-0b06-4553-b43a-943fb22f8961\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.699780 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pm27\" (UniqueName: \"kubernetes.io/projected/8fdd1d16-0b06-4553-b43a-943fb22f8961-kube-api-access-5pm27\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv\" (UID: \"8fdd1d16-0b06-4553-b43a-943fb22f8961\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.699808 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9f64\" (UniqueName: \"kubernetes.io/projected/269fa527-4152-4014-b070-7e651d5f7b2f-kube-api-access-c9f64\") pod \"nova-operator-controller-manager-567668f5cf-vfs4v\" (UID: \"269fa527-4152-4014-b070-7e651d5f7b2f\") " 
pod="openstack-operators/nova-operator-controller-manager-567668f5cf-vfs4v" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.699835 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9scxr\" (UniqueName: \"kubernetes.io/projected/cbb0292f-e15f-4a01-bd91-1c155779be07-kube-api-access-9scxr\") pod \"neutron-operator-controller-manager-64ddbf8bb-drwmx\" (UID: \"cbb0292f-e15f-4a01-bd91-1c155779be07\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-drwmx" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.719936 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7shh\" (UniqueName: \"kubernetes.io/projected/78799685-a70e-4b5d-ae0f-fbd4ac1f48fd-kube-api-access-x7shh\") pod \"octavia-operator-controller-manager-69f8888797-2ctsx\" (UID: \"78799685-a70e-4b5d-ae0f-fbd4ac1f48fd\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-2ctsx" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.720335 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9f64\" (UniqueName: \"kubernetes.io/projected/269fa527-4152-4014-b070-7e651d5f7b2f-kube-api-access-c9f64\") pod \"nova-operator-controller-manager-567668f5cf-vfs4v\" (UID: \"269fa527-4152-4014-b070-7e651d5f7b2f\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-vfs4v" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.726824 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-5t2ll" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.743219 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-6nn8b"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.744520 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-6nn8b" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.748956 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9scxr\" (UniqueName: \"kubernetes.io/projected/cbb0292f-e15f-4a01-bd91-1c155779be07-kube-api-access-9scxr\") pod \"neutron-operator-controller-manager-64ddbf8bb-drwmx\" (UID: \"cbb0292f-e15f-4a01-bd91-1c155779be07\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-drwmx" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.751644 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-nr559" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.754079 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-6nn8b"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.795576 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-sbq48" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.801387 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6mnf\" (UniqueName: \"kubernetes.io/projected/e90352c9-520d-40dc-b9f6-3919a8bd67fb-kube-api-access-m6mnf\") pod \"ovn-operator-controller-manager-d44cf6b75-gbx94\" (UID: \"e90352c9-520d-40dc-b9f6-3919a8bd67fb\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-gbx94" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.801734 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xncdt\" (UniqueName: \"kubernetes.io/projected/f954d2f9-baf9-4d98-bee1-05598035e3a1-kube-api-access-xncdt\") pod \"swift-operator-controller-manager-68f46476f-86mkx\" (UID: \"f954d2f9-baf9-4d98-bee1-05598035e3a1\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-86mkx" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.801869 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fdd1d16-0b06-4553-b43a-943fb22f8961-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv\" (UID: \"8fdd1d16-0b06-4553-b43a-943fb22f8961\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.801937 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pm27\" (UniqueName: \"kubernetes.io/projected/8fdd1d16-0b06-4553-b43a-943fb22f8961-kube-api-access-5pm27\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv\" (UID: \"8fdd1d16-0b06-4553-b43a-943fb22f8961\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 
06:01:19.802002 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm9g7\" (UniqueName: \"kubernetes.io/projected/c7070d2d-1fcc-4ae8-9380-d0f500c95d01-kube-api-access-sm9g7\") pod \"test-operator-controller-manager-7866795846-6nn8b\" (UID: \"c7070d2d-1fcc-4ae8-9380-d0f500c95d01\") " pod="openstack-operators/test-operator-controller-manager-7866795846-6nn8b" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.802069 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsvhq\" (UniqueName: \"kubernetes.io/projected/63e77cf1-d554-4e43-a6e0-93e671cc90fc-kube-api-access-hsvhq\") pod \"telemetry-operator-controller-manager-7f45b4ff68-xkrff\" (UID: \"63e77cf1-d554-4e43-a6e0-93e671cc90fc\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xkrff" Feb 18 06:01:19 crc kubenswrapper[4869]: E0218 06:01:19.802104 4869 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 06:01:19 crc kubenswrapper[4869]: E0218 06:01:19.802160 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fdd1d16-0b06-4553-b43a-943fb22f8961-cert podName:8fdd1d16-0b06-4553-b43a-943fb22f8961 nodeName:}" failed. No retries permitted until 2026-02-18 06:01:20.302142244 +0000 UTC m=+777.471230466 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8fdd1d16-0b06-4553-b43a-943fb22f8961-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv" (UID: "8fdd1d16-0b06-4553-b43a-943fb22f8961") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.802218 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxj5h\" (UniqueName: \"kubernetes.io/projected/e2de3218-8c57-43ab-b45e-e69a92456549-kube-api-access-lxj5h\") pod \"placement-operator-controller-manager-8497b45c89-47sw8\" (UID: \"e2de3218-8c57-43ab-b45e-e69a92456549\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-47sw8" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.809696 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-t6pg6"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.810611 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-t6pg6" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.813783 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-78drz" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.817934 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-t6pg6"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.826632 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxj5h\" (UniqueName: \"kubernetes.io/projected/e2de3218-8c57-43ab-b45e-e69a92456549-kube-api-access-lxj5h\") pod \"placement-operator-controller-manager-8497b45c89-47sw8\" (UID: \"e2de3218-8c57-43ab-b45e-e69a92456549\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-47sw8" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.826874 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xncdt\" (UniqueName: \"kubernetes.io/projected/f954d2f9-baf9-4d98-bee1-05598035e3a1-kube-api-access-xncdt\") pod \"swift-operator-controller-manager-68f46476f-86mkx\" (UID: \"f954d2f9-baf9-4d98-bee1-05598035e3a1\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-86mkx" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.829765 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pm27\" (UniqueName: \"kubernetes.io/projected/8fdd1d16-0b06-4553-b43a-943fb22f8961-kube-api-access-5pm27\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv\" (UID: \"8fdd1d16-0b06-4553-b43a-943fb22f8961\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.834964 4869 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-m6mnf\" (UniqueName: \"kubernetes.io/projected/e90352c9-520d-40dc-b9f6-3919a8bd67fb-kube-api-access-m6mnf\") pod \"ovn-operator-controller-manager-d44cf6b75-gbx94\" (UID: \"e90352c9-520d-40dc-b9f6-3919a8bd67fb\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-gbx94" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.855097 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-drwmx" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.872110 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.873692 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.882864 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-vfs4v" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.883999 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-bxsm7" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.884245 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.884376 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.904908 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-2ctsx" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.905564 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwf5t\" (UniqueName: \"kubernetes.io/projected/fe9a7273-de20-4420-8335-dc291458c338-kube-api-access-gwf5t\") pod \"watcher-operator-controller-manager-5db88f68c-t6pg6\" (UID: \"fe9a7273-de20-4420-8335-dc291458c338\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-t6pg6" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.905616 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm9g7\" (UniqueName: \"kubernetes.io/projected/c7070d2d-1fcc-4ae8-9380-d0f500c95d01-kube-api-access-sm9g7\") pod \"test-operator-controller-manager-7866795846-6nn8b\" (UID: \"c7070d2d-1fcc-4ae8-9380-d0f500c95d01\") " pod="openstack-operators/test-operator-controller-manager-7866795846-6nn8b" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.905650 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsvhq\" (UniqueName: \"kubernetes.io/projected/63e77cf1-d554-4e43-a6e0-93e671cc90fc-kube-api-access-hsvhq\") pod \"telemetry-operator-controller-manager-7f45b4ff68-xkrff\" (UID: \"63e77cf1-d554-4e43-a6e0-93e671cc90fc\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xkrff" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.905684 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea6c026b-8825-42ec-8b66-9c2842957c10-cert\") pod \"infra-operator-controller-manager-79d975b745-fnrx8\" (UID: \"ea6c026b-8825-42ec-8b66-9c2842957c10\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-fnrx8" Feb 18 06:01:19 crc kubenswrapper[4869]: E0218 06:01:19.905815 4869 
secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 18 06:01:19 crc kubenswrapper[4869]: E0218 06:01:19.905864 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6c026b-8825-42ec-8b66-9c2842957c10-cert podName:ea6c026b-8825-42ec-8b66-9c2842957c10 nodeName:}" failed. No retries permitted until 2026-02-18 06:01:20.905840661 +0000 UTC m=+778.074928893 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea6c026b-8825-42ec-8b66-9c2842957c10-cert") pod "infra-operator-controller-manager-79d975b745-fnrx8" (UID: "ea6c026b-8825-42ec-8b66-9c2842957c10") : secret "infra-operator-webhook-server-cert" not found Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.908504 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.933300 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm9g7\" (UniqueName: \"kubernetes.io/projected/c7070d2d-1fcc-4ae8-9380-d0f500c95d01-kube-api-access-sm9g7\") pod \"test-operator-controller-manager-7866795846-6nn8b\" (UID: \"c7070d2d-1fcc-4ae8-9380-d0f500c95d01\") " pod="openstack-operators/test-operator-controller-manager-7866795846-6nn8b" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.938329 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsvhq\" (UniqueName: \"kubernetes.io/projected/63e77cf1-d554-4e43-a6e0-93e671cc90fc-kube-api-access-hsvhq\") pod \"telemetry-operator-controller-manager-7f45b4ff68-xkrff\" (UID: \"63e77cf1-d554-4e43-a6e0-93e671cc90fc\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xkrff" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.942580 4869 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wfxk8"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.943983 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wfxk8" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.951879 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-sqsk2" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.954504 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wfxk8"] Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.962602 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-gbx94" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.975544 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-47sw8" Feb 18 06:01:19 crc kubenswrapper[4869]: I0218 06:01:19.998654 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-86mkx" Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.008501 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vh4k\" (UniqueName: \"kubernetes.io/projected/3990868e-7ca4-439b-a244-6a3336628877-kube-api-access-7vh4k\") pod \"openstack-operator-controller-manager-dccc9b448-nffdh\" (UID: \"3990868e-7ca4-439b-a244-6a3336628877\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.008581 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-metrics-certs\") pod \"openstack-operator-controller-manager-dccc9b448-nffdh\" (UID: \"3990868e-7ca4-439b-a244-6a3336628877\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.008633 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-webhook-certs\") pod \"openstack-operator-controller-manager-dccc9b448-nffdh\" (UID: \"3990868e-7ca4-439b-a244-6a3336628877\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.008668 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b25p6\" (UniqueName: \"kubernetes.io/projected/9d84cc45-ca42-454f-9323-35d717ea7cd4-kube-api-access-b25p6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wfxk8\" (UID: \"9d84cc45-ca42-454f-9323-35d717ea7cd4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wfxk8" Feb 18 
06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.008696 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwf5t\" (UniqueName: \"kubernetes.io/projected/fe9a7273-de20-4420-8335-dc291458c338-kube-api-access-gwf5t\") pod \"watcher-operator-controller-manager-5db88f68c-t6pg6\" (UID: \"fe9a7273-de20-4420-8335-dc291458c338\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-t6pg6" Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.015108 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xkrff" Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.031255 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwf5t\" (UniqueName: \"kubernetes.io/projected/fe9a7273-de20-4420-8335-dc291458c338-kube-api-access-gwf5t\") pod \"watcher-operator-controller-manager-5db88f68c-t6pg6\" (UID: \"fe9a7273-de20-4420-8335-dc291458c338\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-t6pg6" Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.111168 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-metrics-certs\") pod \"openstack-operator-controller-manager-dccc9b448-nffdh\" (UID: \"3990868e-7ca4-439b-a244-6a3336628877\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.111259 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-webhook-certs\") pod \"openstack-operator-controller-manager-dccc9b448-nffdh\" (UID: \"3990868e-7ca4-439b-a244-6a3336628877\") " 
pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.111305 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b25p6\" (UniqueName: \"kubernetes.io/projected/9d84cc45-ca42-454f-9323-35d717ea7cd4-kube-api-access-b25p6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wfxk8\" (UID: \"9d84cc45-ca42-454f-9323-35d717ea7cd4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wfxk8" Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.111352 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vh4k\" (UniqueName: \"kubernetes.io/projected/3990868e-7ca4-439b-a244-6a3336628877-kube-api-access-7vh4k\") pod \"openstack-operator-controller-manager-dccc9b448-nffdh\" (UID: \"3990868e-7ca4-439b-a244-6a3336628877\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" Feb 18 06:01:20 crc kubenswrapper[4869]: E0218 06:01:20.111448 4869 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 18 06:01:20 crc kubenswrapper[4869]: E0218 06:01:20.111536 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-webhook-certs podName:3990868e-7ca4-439b-a244-6a3336628877 nodeName:}" failed. No retries permitted until 2026-02-18 06:01:20.611515634 +0000 UTC m=+777.780603866 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-webhook-certs") pod "openstack-operator-controller-manager-dccc9b448-nffdh" (UID: "3990868e-7ca4-439b-a244-6a3336628877") : secret "webhook-server-cert" not found Feb 18 06:01:20 crc kubenswrapper[4869]: E0218 06:01:20.111844 4869 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 18 06:01:20 crc kubenswrapper[4869]: E0218 06:01:20.111902 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-metrics-certs podName:3990868e-7ca4-439b-a244-6a3336628877 nodeName:}" failed. No retries permitted until 2026-02-18 06:01:20.611890153 +0000 UTC m=+777.780978385 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-metrics-certs") pod "openstack-operator-controller-manager-dccc9b448-nffdh" (UID: "3990868e-7ca4-439b-a244-6a3336628877") : secret "metrics-server-cert" not found Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.133639 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b25p6\" (UniqueName: \"kubernetes.io/projected/9d84cc45-ca42-454f-9323-35d717ea7cd4-kube-api-access-b25p6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wfxk8\" (UID: \"9d84cc45-ca42-454f-9323-35d717ea7cd4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wfxk8" Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.139039 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vh4k\" (UniqueName: \"kubernetes.io/projected/3990868e-7ca4-439b-a244-6a3336628877-kube-api-access-7vh4k\") pod \"openstack-operator-controller-manager-dccc9b448-nffdh\" (UID: \"3990868e-7ca4-439b-a244-6a3336628877\") " 
pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.160113 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-6nn8b" Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.206115 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-t6pg6" Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.235420 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-q5gnw"] Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.275462 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wfxk8" Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.319027 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fdd1d16-0b06-4553-b43a-943fb22f8961-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv\" (UID: \"8fdd1d16-0b06-4553-b43a-943fb22f8961\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv" Feb 18 06:01:20 crc kubenswrapper[4869]: E0218 06:01:20.319417 4869 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 06:01:20 crc kubenswrapper[4869]: E0218 06:01:20.320362 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fdd1d16-0b06-4553-b43a-943fb22f8961-cert podName:8fdd1d16-0b06-4553-b43a-943fb22f8961 nodeName:}" failed. No retries permitted until 2026-02-18 06:01:21.320345223 +0000 UTC m=+778.489433445 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8fdd1d16-0b06-4553-b43a-943fb22f8961-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv" (UID: "8fdd1d16-0b06-4553-b43a-943fb22f8961") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.575715 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-gfxvp"] Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.579143 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-8gxgm"] Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.606285 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-s9l2b"] Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.607910 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-knzcc"] Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.623334 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-gfxvp" event={"ID":"02aea0c3-b59c-41dd-9c48-514fd4bfa94c","Type":"ContainerStarted","Data":"3dc992e089e07cea49109f326ece8f13eb27ba0e884dbfd56009b2ad5216758d"} Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.625520 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-webhook-certs\") pod \"openstack-operator-controller-manager-dccc9b448-nffdh\" (UID: \"3990868e-7ca4-439b-a244-6a3336628877\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.625612 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-metrics-certs\") pod \"openstack-operator-controller-manager-dccc9b448-nffdh\" (UID: \"3990868e-7ca4-439b-a244-6a3336628877\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" Feb 18 06:01:20 crc kubenswrapper[4869]: E0218 06:01:20.625716 4869 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 18 06:01:20 crc kubenswrapper[4869]: E0218 06:01:20.625771 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-metrics-certs podName:3990868e-7ca4-439b-a244-6a3336628877 nodeName:}" failed. No retries permitted until 2026-02-18 06:01:21.625758535 +0000 UTC m=+778.794846767 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-metrics-certs") pod "openstack-operator-controller-manager-dccc9b448-nffdh" (UID: "3990868e-7ca4-439b-a244-6a3336628877") : secret "metrics-server-cert" not found Feb 18 06:01:20 crc kubenswrapper[4869]: E0218 06:01:20.625811 4869 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 18 06:01:20 crc kubenswrapper[4869]: E0218 06:01:20.625830 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-webhook-certs podName:3990868e-7ca4-439b-a244-6a3336628877 nodeName:}" failed. No retries permitted until 2026-02-18 06:01:21.625823076 +0000 UTC m=+778.794911308 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-webhook-certs") pod "openstack-operator-controller-manager-dccc9b448-nffdh" (UID: "3990868e-7ca4-439b-a244-6a3336628877") : secret "webhook-server-cert" not found Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.627234 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-q5gnw" event={"ID":"c1a14efa-4a9b-49b6-a882-c0d080269850","Type":"ContainerStarted","Data":"87c719bd7acf9905b41bf501227fdc6c0a3d8eb259fb3588911f0527786a542b"} Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.640992 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-n5zdw"] Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.930570 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea6c026b-8825-42ec-8b66-9c2842957c10-cert\") pod \"infra-operator-controller-manager-79d975b745-fnrx8\" (UID: \"ea6c026b-8825-42ec-8b66-9c2842957c10\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-fnrx8" Feb 18 06:01:20 crc kubenswrapper[4869]: E0218 06:01:20.930939 4869 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 18 06:01:20 crc kubenswrapper[4869]: E0218 06:01:20.931024 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6c026b-8825-42ec-8b66-9c2842957c10-cert podName:ea6c026b-8825-42ec-8b66-9c2842957c10 nodeName:}" failed. No retries permitted until 2026-02-18 06:01:22.930997153 +0000 UTC m=+780.100085395 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea6c026b-8825-42ec-8b66-9c2842957c10-cert") pod "infra-operator-controller-manager-79d975b745-fnrx8" (UID: "ea6c026b-8825-42ec-8b66-9c2842957c10") : secret "infra-operator-webhook-server-cert" not found Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.961304 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-drwmx"] Feb 18 06:01:20 crc kubenswrapper[4869]: W0218 06:01:20.964373 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbb0292f_e15f_4a01_bd91_1c155779be07.slice/crio-24ff88668b8c66e05091dc17804e54ad89c3a3b7c8c567d34d56c52bec02cf09 WatchSource:0}: Error finding container 24ff88668b8c66e05091dc17804e54ad89c3a3b7c8c567d34d56c52bec02cf09: Status 404 returned error can't find the container with id 24ff88668b8c66e05091dc17804e54ad89c3a3b7c8c567d34d56c52bec02cf09 Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.973499 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-vfs4v"] Feb 18 06:01:20 crc kubenswrapper[4869]: I0218 06:01:20.983003 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-sbq48"] Feb 18 06:01:20 crc kubenswrapper[4869]: W0218 06:01:20.999271 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod835a35ac_1347_46f7_ae71_aa38e8aea7cf.slice/crio-b60d2d4a816d521783c10897db2d7faf15254bc99d81272bba3d624c7ec19c5b WatchSource:0}: Error finding container b60d2d4a816d521783c10897db2d7faf15254bc99d81272bba3d624c7ec19c5b: Status 404 returned error can't find the container with id b60d2d4a816d521783c10897db2d7faf15254bc99d81272bba3d624c7ec19c5b Feb 18 06:01:21 crc kubenswrapper[4869]: 
I0218 06:01:21.008355 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xkrff"] Feb 18 06:01:21 crc kubenswrapper[4869]: I0218 06:01:21.023311 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-55txz"] Feb 18 06:01:21 crc kubenswrapper[4869]: I0218 06:01:21.028157 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-btm9t"] Feb 18 06:01:21 crc kubenswrapper[4869]: I0218 06:01:21.032322 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-47sw8"] Feb 18 06:01:21 crc kubenswrapper[4869]: I0218 06:01:21.043479 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-2ctsx"] Feb 18 06:01:21 crc kubenswrapper[4869]: I0218 06:01:21.048918 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-5t2ll"] Feb 18 06:01:21 crc kubenswrapper[4869]: I0218 06:01:21.070434 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-gbx94"] Feb 18 06:01:21 crc kubenswrapper[4869]: W0218 06:01:21.071503 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2de3218_8c57_43ab_b45e_e69a92456549.slice/crio-865521385a683e6d670856169a48e4c5b33181ac1c227a7fc9c7ed5a8275a90b WatchSource:0}: Error finding container 865521385a683e6d670856169a48e4c5b33181ac1c227a7fc9c7ed5a8275a90b: Status 404 returned error can't find the container with id 865521385a683e6d670856169a48e4c5b33181ac1c227a7fc9c7ed5a8275a90b Feb 18 06:01:21 crc kubenswrapper[4869]: E0218 06:01:21.087669 4869 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m6mnf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-d44cf6b75-gbx94_openstack-operators(e90352c9-520d-40dc-b9f6-3919a8bd67fb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 18 06:01:21 crc kubenswrapper[4869]: E0218 06:01:21.087871 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-67s8v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-554564d7fc-55txz_openstack-operators(da28bc19-c4b2-4d9a-8357-6ce9680567ce): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 18 06:01:21 crc kubenswrapper[4869]: E0218 06:01:21.092252 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-55txz" podUID="da28bc19-c4b2-4d9a-8357-6ce9680567ce" Feb 18 06:01:21 crc 
kubenswrapper[4869]: E0218 06:01:21.092504 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6b7kc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b4d948c87-btm9t_openstack-operators(3519f676-e828-4ec9-8995-ecf778e36d4f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 18 06:01:21 crc kubenswrapper[4869]: E0218 06:01:21.094171 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-btm9t" podUID="3519f676-e828-4ec9-8995-ecf778e36d4f" Feb 18 06:01:21 crc kubenswrapper[4869]: E0218 06:01:21.094873 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-gbx94" podUID="e90352c9-520d-40dc-b9f6-3919a8bd67fb" Feb 18 06:01:21 crc kubenswrapper[4869]: E0218 06:01:21.113249 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gwf5t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5db88f68c-t6pg6_openstack-operators(fe9a7273-de20-4420-8335-dc291458c338): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 18 06:01:21 crc kubenswrapper[4869]: W0218 06:01:21.113680 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf954d2f9_baf9_4d98_bee1_05598035e3a1.slice/crio-c7c685962cea526846cbb887aff64235411e5f6d0a7a12ea1ee8fa15613bee28 WatchSource:0}: Error finding container c7c685962cea526846cbb887aff64235411e5f6d0a7a12ea1ee8fa15613bee28: Status 404 returned error can't find the container with id c7c685962cea526846cbb887aff64235411e5f6d0a7a12ea1ee8fa15613bee28 Feb 18 06:01:21 crc kubenswrapper[4869]: E0218 06:01:21.114855 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-t6pg6" podUID="fe9a7273-de20-4420-8335-dc291458c338" Feb 18 06:01:21 crc kubenswrapper[4869]: E0218 06:01:21.118000 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xncdt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-86mkx_openstack-operators(f954d2f9-baf9-4d98-bee1-05598035e3a1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 18 06:01:21 crc kubenswrapper[4869]: E0218 06:01:21.118174 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sm9g7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7866795846-6nn8b_openstack-operators(c7070d2d-1fcc-4ae8-9380-d0f500c95d01): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 18 06:01:21 crc kubenswrapper[4869]: E0218 06:01:21.119260 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7866795846-6nn8b" podUID="c7070d2d-1fcc-4ae8-9380-d0f500c95d01" Feb 18 06:01:21 crc 
kubenswrapper[4869]: I0218 06:01:21.119435 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-t6pg6"] Feb 18 06:01:21 crc kubenswrapper[4869]: E0218 06:01:21.119498 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-86mkx" podUID="f954d2f9-baf9-4d98-bee1-05598035e3a1" Feb 18 06:01:21 crc kubenswrapper[4869]: I0218 06:01:21.130174 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-6nn8b"] Feb 18 06:01:21 crc kubenswrapper[4869]: I0218 06:01:21.136841 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-86mkx"] Feb 18 06:01:21 crc kubenswrapper[4869]: I0218 06:01:21.196049 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wfxk8"] Feb 18 06:01:21 crc kubenswrapper[4869]: I0218 06:01:21.341761 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fdd1d16-0b06-4553-b43a-943fb22f8961-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv\" (UID: \"8fdd1d16-0b06-4553-b43a-943fb22f8961\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv" Feb 18 06:01:21 crc kubenswrapper[4869]: E0218 06:01:21.342035 4869 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 06:01:21 crc kubenswrapper[4869]: E0218 06:01:21.342112 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fdd1d16-0b06-4553-b43a-943fb22f8961-cert 
podName:8fdd1d16-0b06-4553-b43a-943fb22f8961 nodeName:}" failed. No retries permitted until 2026-02-18 06:01:23.342090031 +0000 UTC m=+780.511178263 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8fdd1d16-0b06-4553-b43a-943fb22f8961-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv" (UID: "8fdd1d16-0b06-4553-b43a-943fb22f8961") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 06:01:21 crc kubenswrapper[4869]: I0218 06:01:21.634123 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-5t2ll" event={"ID":"3627c187-4d3b-49cb-9367-5758e676b0af","Type":"ContainerStarted","Data":"dea62f5bd8bd22b5a5b73bce91513402b62d950647bdbb5c563371d7fe516683"} Feb 18 06:01:21 crc kubenswrapper[4869]: I0218 06:01:21.635498 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xkrff" event={"ID":"63e77cf1-d554-4e43-a6e0-93e671cc90fc","Type":"ContainerStarted","Data":"1bca5515081825162b99af9671e053abf234a638fae23da03b7b01231956bf38"} Feb 18 06:01:21 crc kubenswrapper[4869]: I0218 06:01:21.639133 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-47sw8" event={"ID":"e2de3218-8c57-43ab-b45e-e69a92456549","Type":"ContainerStarted","Data":"865521385a683e6d670856169a48e4c5b33181ac1c227a7fc9c7ed5a8275a90b"} Feb 18 06:01:21 crc kubenswrapper[4869]: I0218 06:01:21.640401 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-86mkx" event={"ID":"f954d2f9-baf9-4d98-bee1-05598035e3a1","Type":"ContainerStarted","Data":"c7c685962cea526846cbb887aff64235411e5f6d0a7a12ea1ee8fa15613bee28"} Feb 18 06:01:21 crc kubenswrapper[4869]: I0218 06:01:21.641569 4869 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-drwmx" event={"ID":"cbb0292f-e15f-4a01-bd91-1c155779be07","Type":"ContainerStarted","Data":"24ff88668b8c66e05091dc17804e54ad89c3a3b7c8c567d34d56c52bec02cf09"} Feb 18 06:01:21 crc kubenswrapper[4869]: E0218 06:01:21.643127 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-86mkx" podUID="f954d2f9-baf9-4d98-bee1-05598035e3a1" Feb 18 06:01:21 crc kubenswrapper[4869]: I0218 06:01:21.644169 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-t6pg6" event={"ID":"fe9a7273-de20-4420-8335-dc291458c338","Type":"ContainerStarted","Data":"918a864a1204b8be4c827e92f6247396643c371109f6094c486ed176f33ff528"} Feb 18 06:01:21 crc kubenswrapper[4869]: E0218 06:01:21.645618 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-t6pg6" podUID="fe9a7273-de20-4420-8335-dc291458c338" Feb 18 06:01:21 crc kubenswrapper[4869]: I0218 06:01:21.655466 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-metrics-certs\") pod \"openstack-operator-controller-manager-dccc9b448-nffdh\" (UID: \"3990868e-7ca4-439b-a244-6a3336628877\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" Feb 18 06:01:21 crc 
kubenswrapper[4869]: I0218 06:01:21.655558 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-webhook-certs\") pod \"openstack-operator-controller-manager-dccc9b448-nffdh\" (UID: \"3990868e-7ca4-439b-a244-6a3336628877\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" Feb 18 06:01:21 crc kubenswrapper[4869]: E0218 06:01:21.655728 4869 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 18 06:01:21 crc kubenswrapper[4869]: E0218 06:01:21.655792 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-webhook-certs podName:3990868e-7ca4-439b-a244-6a3336628877 nodeName:}" failed. No retries permitted until 2026-02-18 06:01:23.655775696 +0000 UTC m=+780.824863928 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-webhook-certs") pod "openstack-operator-controller-manager-dccc9b448-nffdh" (UID: "3990868e-7ca4-439b-a244-6a3336628877") : secret "webhook-server-cert" not found Feb 18 06:01:21 crc kubenswrapper[4869]: I0218 06:01:21.656298 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-knzcc" event={"ID":"e5345d15-54e7-4c42-92d2-e3f4d63e9533","Type":"ContainerStarted","Data":"6cf0e7cfa8a99f3e834e91d6655c28db2cde858c57cf3b745172abd7b264ebd4"} Feb 18 06:01:21 crc kubenswrapper[4869]: E0218 06:01:21.656384 4869 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 18 06:01:21 crc kubenswrapper[4869]: E0218 06:01:21.656424 4869 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-metrics-certs podName:3990868e-7ca4-439b-a244-6a3336628877 nodeName:}" failed. No retries permitted until 2026-02-18 06:01:23.656409471 +0000 UTC m=+780.825497703 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-metrics-certs") pod "openstack-operator-controller-manager-dccc9b448-nffdh" (UID: "3990868e-7ca4-439b-a244-6a3336628877") : secret "metrics-server-cert" not found Feb 18 06:01:21 crc kubenswrapper[4869]: I0218 06:01:21.659778 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-s9l2b" event={"ID":"65820ad0-cf24-499c-b418-8980edb8788a","Type":"ContainerStarted","Data":"4dcd39e4bc3012d6c44a5504a8cde10620e5f65aa39dccdbe949d2fe2ea3b171"} Feb 18 06:01:21 crc kubenswrapper[4869]: I0218 06:01:21.666708 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-8gxgm" event={"ID":"77f20e81-cc4d-44ab-9f77-40080cc392ec","Type":"ContainerStarted","Data":"1ece064f0993b1dbf2ec547bdbb31b18e5c3c139abe5d9dfb48f7ce0773910dd"} Feb 18 06:01:21 crc kubenswrapper[4869]: I0218 06:01:21.668737 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-btm9t" event={"ID":"3519f676-e828-4ec9-8995-ecf778e36d4f","Type":"ContainerStarted","Data":"b893faed37164b109fe9e3c0e6c44a5711d6dc5eb3b8ed846edac971b4308612"} Feb 18 06:01:21 crc kubenswrapper[4869]: E0218 06:01:21.670466 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" 
pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-btm9t" podUID="3519f676-e828-4ec9-8995-ecf778e36d4f" Feb 18 06:01:21 crc kubenswrapper[4869]: I0218 06:01:21.670729 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-2ctsx" event={"ID":"78799685-a70e-4b5d-ae0f-fbd4ac1f48fd","Type":"ContainerStarted","Data":"c6d471120a7f23c49f378dfa9da9140d445caf70470d6b8cf0b9a2ea19c96221"} Feb 18 06:01:21 crc kubenswrapper[4869]: I0218 06:01:21.672308 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-gbx94" event={"ID":"e90352c9-520d-40dc-b9f6-3919a8bd67fb","Type":"ContainerStarted","Data":"278afef94ab6408a98d074cb70af2bae274f99b701e1d2c630579a66b1c8c88b"} Feb 18 06:01:21 crc kubenswrapper[4869]: E0218 06:01:21.675405 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-gbx94" podUID="e90352c9-520d-40dc-b9f6-3919a8bd67fb" Feb 18 06:01:21 crc kubenswrapper[4869]: I0218 06:01:21.692168 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-sbq48" event={"ID":"835a35ac-1347-46f7-ae71-aa38e8aea7cf","Type":"ContainerStarted","Data":"b60d2d4a816d521783c10897db2d7faf15254bc99d81272bba3d624c7ec19c5b"} Feb 18 06:01:21 crc kubenswrapper[4869]: I0218 06:01:21.693984 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wfxk8" event={"ID":"9d84cc45-ca42-454f-9323-35d717ea7cd4","Type":"ContainerStarted","Data":"d87b76ca5befa5f3d3c269c87392a07aa06cf0c6da41a79834e1bd65503389c3"} Feb 18 06:01:21 crc 
kubenswrapper[4869]: I0218 06:01:21.698795 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-vfs4v" event={"ID":"269fa527-4152-4014-b070-7e651d5f7b2f","Type":"ContainerStarted","Data":"e34fadb89cd9358af474a4605b01aea69ec81812cf6a91f7976130a81ec85d17"} Feb 18 06:01:21 crc kubenswrapper[4869]: I0218 06:01:21.706666 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-6nn8b" event={"ID":"c7070d2d-1fcc-4ae8-9380-d0f500c95d01","Type":"ContainerStarted","Data":"8b3ca18745655c724f76d4b51a2105e99259661f7c28572b98c6fe03df7d1ddd"} Feb 18 06:01:21 crc kubenswrapper[4869]: E0218 06:01:21.710953 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-6nn8b" podUID="c7070d2d-1fcc-4ae8-9380-d0f500c95d01" Feb 18 06:01:21 crc kubenswrapper[4869]: I0218 06:01:21.712248 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-55txz" event={"ID":"da28bc19-c4b2-4d9a-8357-6ce9680567ce","Type":"ContainerStarted","Data":"1458961126c3a7e896f8ab990ca2c48211c52f8a526ebfbec0cf7a5cc08647dd"} Feb 18 06:01:21 crc kubenswrapper[4869]: E0218 06:01:21.713381 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-55txz" podUID="da28bc19-c4b2-4d9a-8357-6ce9680567ce" Feb 18 06:01:21 crc 
kubenswrapper[4869]: I0218 06:01:21.714362 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-n5zdw" event={"ID":"4a638516-be5b-4a24-9d1a-cc5dbcaac3ed","Type":"ContainerStarted","Data":"96e9b9e8bda8dac1eb84ff1adbefe9c0a2ed0610bf853ccb50598f2495fc4159"} Feb 18 06:01:22 crc kubenswrapper[4869]: E0218 06:01:22.728622 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-55txz" podUID="da28bc19-c4b2-4d9a-8357-6ce9680567ce" Feb 18 06:01:22 crc kubenswrapper[4869]: E0218 06:01:22.728842 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-btm9t" podUID="3519f676-e828-4ec9-8995-ecf778e36d4f" Feb 18 06:01:22 crc kubenswrapper[4869]: E0218 06:01:22.728891 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-t6pg6" podUID="fe9a7273-de20-4420-8335-dc291458c338" Feb 18 06:01:22 crc kubenswrapper[4869]: E0218 06:01:22.728914 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-6nn8b" podUID="c7070d2d-1fcc-4ae8-9380-d0f500c95d01" Feb 18 06:01:22 crc kubenswrapper[4869]: E0218 06:01:22.728892 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-86mkx" podUID="f954d2f9-baf9-4d98-bee1-05598035e3a1" Feb 18 06:01:22 crc kubenswrapper[4869]: E0218 06:01:22.728923 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-gbx94" podUID="e90352c9-520d-40dc-b9f6-3919a8bd67fb" Feb 18 06:01:22 crc kubenswrapper[4869]: I0218 06:01:22.989921 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea6c026b-8825-42ec-8b66-9c2842957c10-cert\") pod \"infra-operator-controller-manager-79d975b745-fnrx8\" (UID: \"ea6c026b-8825-42ec-8b66-9c2842957c10\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-fnrx8" Feb 18 06:01:22 crc kubenswrapper[4869]: E0218 06:01:22.990159 4869 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 18 06:01:22 crc kubenswrapper[4869]: E0218 06:01:22.990260 4869 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ea6c026b-8825-42ec-8b66-9c2842957c10-cert podName:ea6c026b-8825-42ec-8b66-9c2842957c10 nodeName:}" failed. No retries permitted until 2026-02-18 06:01:26.990239805 +0000 UTC m=+784.159328037 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea6c026b-8825-42ec-8b66-9c2842957c10-cert") pod "infra-operator-controller-manager-79d975b745-fnrx8" (UID: "ea6c026b-8825-42ec-8b66-9c2842957c10") : secret "infra-operator-webhook-server-cert" not found Feb 18 06:01:23 crc kubenswrapper[4869]: I0218 06:01:23.397863 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fdd1d16-0b06-4553-b43a-943fb22f8961-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv\" (UID: \"8fdd1d16-0b06-4553-b43a-943fb22f8961\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv" Feb 18 06:01:23 crc kubenswrapper[4869]: E0218 06:01:23.398092 4869 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 06:01:23 crc kubenswrapper[4869]: E0218 06:01:23.398136 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fdd1d16-0b06-4553-b43a-943fb22f8961-cert podName:8fdd1d16-0b06-4553-b43a-943fb22f8961 nodeName:}" failed. No retries permitted until 2026-02-18 06:01:27.398122764 +0000 UTC m=+784.567210996 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8fdd1d16-0b06-4553-b43a-943fb22f8961-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv" (UID: "8fdd1d16-0b06-4553-b43a-943fb22f8961") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 06:01:23 crc kubenswrapper[4869]: I0218 06:01:23.705265 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-metrics-certs\") pod \"openstack-operator-controller-manager-dccc9b448-nffdh\" (UID: \"3990868e-7ca4-439b-a244-6a3336628877\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" Feb 18 06:01:23 crc kubenswrapper[4869]: I0218 06:01:23.705341 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-webhook-certs\") pod \"openstack-operator-controller-manager-dccc9b448-nffdh\" (UID: \"3990868e-7ca4-439b-a244-6a3336628877\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" Feb 18 06:01:23 crc kubenswrapper[4869]: E0218 06:01:23.705454 4869 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 18 06:01:23 crc kubenswrapper[4869]: E0218 06:01:23.705497 4869 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 18 06:01:23 crc kubenswrapper[4869]: E0218 06:01:23.705532 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-metrics-certs podName:3990868e-7ca4-439b-a244-6a3336628877 nodeName:}" failed. No retries permitted until 2026-02-18 06:01:27.705513385 +0000 UTC m=+784.874601607 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-metrics-certs") pod "openstack-operator-controller-manager-dccc9b448-nffdh" (UID: "3990868e-7ca4-439b-a244-6a3336628877") : secret "metrics-server-cert" not found Feb 18 06:01:23 crc kubenswrapper[4869]: E0218 06:01:23.705552 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-webhook-certs podName:3990868e-7ca4-439b-a244-6a3336628877 nodeName:}" failed. No retries permitted until 2026-02-18 06:01:27.705545726 +0000 UTC m=+784.874633958 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-webhook-certs") pod "openstack-operator-controller-manager-dccc9b448-nffdh" (UID: "3990868e-7ca4-439b-a244-6a3336628877") : secret "webhook-server-cert" not found Feb 18 06:01:55 crc kubenswrapper[4869]: E0218 06:01:55.771976 4869 kubelet.go:2359] "Skipping pod synchronization" err="container runtime is down" Feb 18 06:01:55 crc kubenswrapper[4869]: E0218 06:01:55.911056 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage2996617598/1\": happened during read: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a" Feb 18 06:01:55 crc kubenswrapper[4869]: E0218 06:01:55.911628 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k4qfh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6994f66f48-sbq48_openstack-operators(835a35ac-1347-46f7-ae71-aa38e8aea7cf): ErrImagePull: rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage2996617598/1\": happened during read: context canceled" logger="UnhandledError" Feb 18 06:01:55 crc kubenswrapper[4869]: E0218 06:01:55.860896 4869 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 18 06:01:55 crc kubenswrapper[4869]: E0218 06:01:55.912114 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-metrics-certs podName:3990868e-7ca4-439b-a244-6a3336628877 nodeName:}" failed. No retries permitted until 2026-02-18 06:02:03.912093654 +0000 UTC m=+821.081181886 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-metrics-certs") pod "openstack-operator-controller-manager-dccc9b448-nffdh" (UID: "3990868e-7ca4-439b-a244-6a3336628877") : secret "metrics-server-cert" not found Feb 18 06:01:55 crc kubenswrapper[4869]: E0218 06:01:55.913902 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = writing blob: storing blob to file \\\"/var/tmp/container_images_storage2996617598/1\\\": happened during read: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-sbq48" podUID="835a35ac-1347-46f7-ae71-aa38e8aea7cf" Feb 18 06:01:55 crc kubenswrapper[4869]: I0218 06:01:55.947288 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 06:01:55 crc kubenswrapper[4869]: I0218 06:01:55.947340 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T06:01:55Z","lastTransitionTime":"2026-02-18T06:01:55Z","reason":"KubeletNotReady","message":"container runtime is down"} Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.054978 4869 kubelet.go:2359] "Skipping pod synchronization" err="container runtime is down" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.055971 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage2434924389/1\": happened during read: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.056195 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x24ns,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-54f6768c69-5t2ll_openstack-operators(3627c187-4d3b-49cb-9367-5758e676b0af): ErrImagePull: rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage2434924389/1\": happened during read: context canceled" logger="UnhandledError" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.056266 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage2958873528/1\": happened during read: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:1ab3ec59cd8e30dd8423e91ad832403bdefbae3b8ac47e15578d5a677d7ba0df" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.056357 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:1ab3ec59cd8e30dd8423e91ad832403bdefbae3b8ac47e15578d5a677d7ba0df,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hjpsd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987464f4-gfxvp_openstack-operators(02aea0c3-b59c-41dd-9c48-514fd4bfa94c): ErrImagePull: rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage2958873528/1\": happened during read: context canceled" logger="UnhandledError" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.084224 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = writing blob: storing blob to file \\\"/var/tmp/container_images_storage2958873528/1\\\": happened during read: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-77987464f4-gfxvp" podUID="02aea0c3-b59c-41dd-9c48-514fd4bfa94c" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.084345 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = writing blob: storing blob to file \\\"/var/tmp/container_images_storage2434924389/1\\\": happened during read: context canceled\"" 
pod="openstack-operators/manila-operator-controller-manager-54f6768c69-5t2ll" podUID="3627c187-4d3b-49cb-9367-5758e676b0af" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.111311 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage2824745792/1\": happened during read: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:c1e33e962043cd6e3d09ebd225cb72781451dba7af2d57522e5c6eedbdc91642" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.111472 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:c1e33e962043cd6e3d09ebd225cb72781451dba7af2d57522e5c6eedbdc91642,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-djd66,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-6d8bf5c495-8gxgm_openstack-operators(77f20e81-cc4d-44ab-9f77-40080cc392ec): ErrImagePull: rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage2824745792/1\": happened during read: context canceled" logger="UnhandledError" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.114022 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = writing blob: storing blob to file \\\"/var/tmp/container_images_storage2824745792/1\\\": happened during read: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-8gxgm" podUID="77f20e81-cc4d-44ab-9f77-40080cc392ec" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.324148 4869 log.go:32] 
"PullImage from image service failed" err="rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage1071001866/1\": happened during read: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.340529 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hsvhq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-7f45b4ff68-xkrff_openstack-operators(63e77cf1-d554-4e43-a6e0-93e671cc90fc): ErrImagePull: rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage1071001866/1\": happened during read: context canceled" logger="UnhandledError" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.342237 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = writing blob: storing blob to file \\\"/var/tmp/container_images_storage1071001866/1\\\": happened during read: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xkrff" podUID="63e77cf1-d554-4e43-a6e0-93e671cc90fc" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.400815 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage1383523365/2\": happened during read: context canceled" 
image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.402912 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b25p6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-wfxk8_openstack-operators(9d84cc45-ca42-454f-9323-35d717ea7cd4): ErrImagePull: rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage1383523365/2\": happened during read: context canceled" logger="UnhandledError" Feb 18 06:01:56 crc kubenswrapper[4869]: I0218 06:01:56.421153 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-59xq2" podUID="9742d031-8f05-438c-8028-700eb13042fe" containerName="hostpath-provisioner" probeResult="failure" output="Get \"http://10.217.0.43:9898/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 18 06:01:56 crc kubenswrapper[4869]: I0218 06:01:55.860647 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-metrics-certs\") pod \"openstack-operator-controller-manager-dccc9b448-nffdh\" (UID: \"3990868e-7ca4-439b-a244-6a3336628877\") " 
pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.422334 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = writing blob: storing blob to file \\\"/var/tmp/container_images_storage1383523365/2\\\": happened during read: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wfxk8" podUID="9d84cc45-ca42-454f-9323-35d717ea7cd4" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.422678 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-sbq48" podUID="835a35ac-1347-46f7-ae71-aa38e8aea7cf" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.439068 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage1765626252/1\": happened during read: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:90ad8fd8c1889b6be77925016532218eb6149d2c1c8535a5f9f1775c776fa6cc" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.440123 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage3491638511/1\": happened during read: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:e8a675284ff97a1d3f0f07583863be20b20b4aa48ebb34dbc80d83fe39d757b2" Feb 18 06:01:56 crc kubenswrapper[4869]: I0218 06:01:56.474404 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/ea6c026b-8825-42ec-8b66-9c2842957c10-cert\") pod \"infra-operator-controller-manager-79d975b745-fnrx8\" (UID: \"ea6c026b-8825-42ec-8b66-9c2842957c10\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-fnrx8" Feb 18 06:01:56 crc kubenswrapper[4869]: I0218 06:01:56.474978 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-webhook-certs\") pod \"openstack-operator-controller-manager-dccc9b448-nffdh\" (UID: \"3990868e-7ca4-439b-a244-6a3336628877\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" Feb 18 06:01:56 crc kubenswrapper[4869]: I0218 06:01:56.475142 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fdd1d16-0b06-4553-b43a-943fb22f8961-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv\" (UID: \"8fdd1d16-0b06-4553-b43a-943fb22f8961\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.541012 4869 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.541118 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-webhook-certs podName:3990868e-7ca4-439b-a244-6a3336628877 nodeName:}" failed. No retries permitted until 2026-02-18 06:02:04.541096744 +0000 UTC m=+821.710184976 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-webhook-certs") pod "openstack-operator-controller-manager-dccc9b448-nffdh" (UID: "3990868e-7ca4-439b-a244-6a3336628877") : secret "webhook-server-cert" not found Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.542008 4869 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.542058 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6c026b-8825-42ec-8b66-9c2842957c10-cert podName:ea6c026b-8825-42ec-8b66-9c2842957c10 nodeName:}" failed. No retries permitted until 2026-02-18 06:02:04.542046268 +0000 UTC m=+821.711134500 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea6c026b-8825-42ec-8b66-9c2842957c10-cert") pod "infra-operator-controller-manager-79d975b745-fnrx8" (UID: "ea6c026b-8825-42ec-8b66-9c2842957c10") : secret "infra-operator-webhook-server-cert" not found Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.542698 4869 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.542763 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fdd1d16-0b06-4553-b43a-943fb22f8961-cert podName:8fdd1d16-0b06-4553-b43a-943fb22f8961 nodeName:}" failed. No retries permitted until 2026-02-18 06:02:04.542731124 +0000 UTC m=+821.711819356 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8fdd1d16-0b06-4553-b43a-943fb22f8961-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv" (UID: "8fdd1d16-0b06-4553-b43a-943fb22f8961") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.441435 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:90ad8fd8c1889b6be77925016532218eb6149d2c1c8535a5f9f1775c776fa6cc,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9tngq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-868647ff47-q5gnw_openstack-operators(c1a14efa-4a9b-49b6-a882-c0d080269850): ErrImagePull: rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage1765626252/1\": happened during read: context canceled" logger="UnhandledError" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.562898 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = writing blob: storing blob to file \\\"/var/tmp/container_images_storage1765626252/1\\\": happened during read: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-q5gnw" podUID="c1a14efa-4a9b-49b6-a882-c0d080269850" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.608687 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage2713708861/1\": happened during read: context canceled" 
image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.608934 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b9hss,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-5b9b8895d5-n5zdw_openstack-operators(4a638516-be5b-4a24-9d1a-cc5dbcaac3ed): ErrImagePull: rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage2713708861/1\": happened during read: context canceled" logger="UnhandledError" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.610681 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = writing blob: storing blob to file \\\"/var/tmp/container_images_storage2713708861/1\\\": happened during read: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-n5zdw" podUID="4a638516-be5b-4a24-9d1a-cc5dbcaac3ed" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.621176 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage2976933976/1\": happened during read: context canceled" 
image="quay.io/openstack-k8s-operators/cinder-operator@sha256:2b8ab3063af4aaeed0198197aae6f391c6647ac686c94c85668537f1d5933979" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.621366 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:2b8ab3063af4aaeed0198197aae6f391c6647ac686c94c85668537f1d5933979,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n8gcz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-5d946d989d-knzcc_openstack-operators(e5345d15-54e7-4c42-92d2-e3f4d63e9533): ErrImagePull: rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage2976933976/1\": happened during read: context canceled" logger="UnhandledError" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.626304 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = writing blob: storing blob to file \\\"/var/tmp/container_images_storage2976933976/1\\\": happened during read: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-knzcc" podUID="e5345d15-54e7-4c42-92d2-e3f4d63e9533" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.673379 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage1307257859/1\": happened during read: context canceled" 
image="quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.679859 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9scxr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-64ddbf8bb-drwmx_openstack-operators(cbb0292f-e15f-4a01-bd91-1c155779be07): ErrImagePull: rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage1307257859/1\": happened during read: context canceled" logger="UnhandledError" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.681401 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = writing blob: storing blob to file \\\"/var/tmp/container_images_storage1307257859/1\\\": happened during read: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-drwmx" podUID="cbb0292f-e15f-4a01-bd91-1c155779be07" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.698916 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage3286067609/1\": happened during read: context canceled" 
image="quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.699098 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lxj5h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8497b45c89-47sw8_openstack-operators(e2de3218-8c57-43ab-b45e-e69a92456549): ErrImagePull: rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage3286067609/1\": happened during read: context canceled" logger="UnhandledError" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.705836 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage1854564908/1\": happened during read: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.705992 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x7shh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-69f8888797-2ctsx_openstack-operators(78799685-a70e-4b5d-ae0f-fbd4ac1f48fd): ErrImagePull: rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage1854564908/1\": happened during read: context canceled" logger="UnhandledError" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.706471 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = writing blob: storing blob to file \\\"/var/tmp/container_images_storage3286067609/1\\\": happened during read: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-47sw8" podUID="e2de3218-8c57-43ab-b45e-e69a92456549" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.712288 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:e8a675284ff97a1d3f0f07583863be20b20b4aa48ebb34dbc80d83fe39d757b2,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vg2r9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-69f49c598c-s9l2b_openstack-operators(65820ad0-cf24-499c-b418-8980edb8788a): ErrImagePull: rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage3491638511/1\": happened during read: context canceled" logger="UnhandledError" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.719238 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = writing blob: storing blob to file \\\"/var/tmp/container_images_storage1854564908/1\\\": happened during read: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-2ctsx" podUID="78799685-a70e-4b5d-ae0f-fbd4ac1f48fd" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.719343 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = writing blob: storing blob to file \\\"/var/tmp/container_images_storage3491638511/1\\\": happened during read: context canceled\"" 
pod="openstack-operators/heat-operator-controller-manager-69f49c598c-s9l2b" podUID="65820ad0-cf24-499c-b418-8980edb8788a" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.772028 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage3440731828/2\": happened during read: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.772212 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c9f64,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-vfs4v_openstack-operators(269fa527-4152-4014-b070-7e651d5f7b2f): ErrImagePull: rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage3440731828/2\": happened during read: context canceled" logger="UnhandledError" Feb 18 06:01:56 crc kubenswrapper[4869]: E0218 06:01:56.773712 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = writing blob: storing blob to file \\\"/var/tmp/container_images_storage3440731828/2\\\": happened during read: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-vfs4v" podUID="269fa527-4152-4014-b070-7e651d5f7b2f" Feb 18 06:01:57 crc kubenswrapper[4869]: E0218 06:01:57.122139 4869 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-47sw8" podUID="e2de3218-8c57-43ab-b45e-e69a92456549" Feb 18 06:01:57 crc kubenswrapper[4869]: E0218 06:01:57.123768 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-drwmx" podUID="cbb0292f-e15f-4a01-bd91-1c155779be07" Feb 18 06:01:57 crc kubenswrapper[4869]: E0218 06:01:57.123845 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:c1e33e962043cd6e3d09ebd225cb72781451dba7af2d57522e5c6eedbdc91642\\\"\"" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-8gxgm" podUID="77f20e81-cc4d-44ab-9f77-40080cc392ec" Feb 18 06:01:57 crc kubenswrapper[4869]: E0218 06:01:57.123892 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-vfs4v" podUID="269fa527-4152-4014-b070-7e651d5f7b2f" Feb 18 06:01:57 crc kubenswrapper[4869]: E0218 06:01:57.123928 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-2ctsx" podUID="78799685-a70e-4b5d-ae0f-fbd4ac1f48fd" Feb 18 06:01:57 crc kubenswrapper[4869]: E0218 06:01:57.123962 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wfxk8" podUID="9d84cc45-ca42-454f-9323-35d717ea7cd4" Feb 18 06:01:57 crc kubenswrapper[4869]: E0218 06:01:57.124002 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:e8a675284ff97a1d3f0f07583863be20b20b4aa48ebb34dbc80d83fe39d757b2\\\"\"" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-s9l2b" podUID="65820ad0-cf24-499c-b418-8980edb8788a" Feb 18 06:01:57 crc kubenswrapper[4869]: E0218 06:01:57.124049 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c\\\"\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-5t2ll" podUID="3627c187-4d3b-49cb-9367-5758e676b0af" Feb 18 06:01:57 crc kubenswrapper[4869]: E0218 06:01:57.124088 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-n5zdw" podUID="4a638516-be5b-4a24-9d1a-cc5dbcaac3ed" Feb 18 06:01:57 crc kubenswrapper[4869]: E0218 06:01:57.124856 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xkrff" podUID="63e77cf1-d554-4e43-a6e0-93e671cc90fc" Feb 18 06:01:57 crc kubenswrapper[4869]: E0218 06:01:57.125625 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:1ab3ec59cd8e30dd8423e91ad832403bdefbae3b8ac47e15578d5a677d7ba0df\\\"\"" pod="openstack-operators/glance-operator-controller-manager-77987464f4-gfxvp" podUID="02aea0c3-b59c-41dd-9c48-514fd4bfa94c" Feb 18 06:01:57 crc kubenswrapper[4869]: E0218 06:01:57.127060 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:90ad8fd8c1889b6be77925016532218eb6149d2c1c8535a5f9f1775c776fa6cc\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-q5gnw" podUID="c1a14efa-4a9b-49b6-a882-c0d080269850" Feb 18 06:01:57 crc kubenswrapper[4869]: E0218 06:01:57.127877 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:2b8ab3063af4aaeed0198197aae6f391c6647ac686c94c85668537f1d5933979\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-knzcc" podUID="e5345d15-54e7-4c42-92d2-e3f4d63e9533" Feb 18 06:02:03 crc kubenswrapper[4869]: I0218 06:02:03.964501 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-metrics-certs\") pod \"openstack-operator-controller-manager-dccc9b448-nffdh\" (UID: \"3990868e-7ca4-439b-a244-6a3336628877\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" Feb 18 06:02:03 crc kubenswrapper[4869]: E0218 06:02:03.964772 4869 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 18 06:02:03 crc kubenswrapper[4869]: E0218 06:02:03.965394 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-metrics-certs podName:3990868e-7ca4-439b-a244-6a3336628877 nodeName:}" failed. No retries permitted until 2026-02-18 06:02:19.965366716 +0000 UTC m=+837.134454988 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-metrics-certs") pod "openstack-operator-controller-manager-dccc9b448-nffdh" (UID: "3990868e-7ca4-439b-a244-6a3336628877") : secret "metrics-server-cert" not found Feb 18 06:02:04 crc kubenswrapper[4869]: I0218 06:02:04.576172 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fdd1d16-0b06-4553-b43a-943fb22f8961-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv\" (UID: \"8fdd1d16-0b06-4553-b43a-943fb22f8961\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv" Feb 18 06:02:04 crc kubenswrapper[4869]: I0218 06:02:04.576329 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea6c026b-8825-42ec-8b66-9c2842957c10-cert\") pod \"infra-operator-controller-manager-79d975b745-fnrx8\" (UID: \"ea6c026b-8825-42ec-8b66-9c2842957c10\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-fnrx8" Feb 18 06:02:04 crc kubenswrapper[4869]: I0218 06:02:04.576354 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-webhook-certs\") pod \"openstack-operator-controller-manager-dccc9b448-nffdh\" (UID: \"3990868e-7ca4-439b-a244-6a3336628877\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" Feb 18 06:02:04 crc kubenswrapper[4869]: E0218 06:02:04.576489 4869 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 18 06:02:04 crc kubenswrapper[4869]: E0218 06:02:04.576538 4869 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret 
"openstack-baremetal-operator-webhook-server-cert" not found Feb 18 06:02:04 crc kubenswrapper[4869]: E0218 06:02:04.576591 4869 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 18 06:02:04 crc kubenswrapper[4869]: E0218 06:02:04.576551 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-webhook-certs podName:3990868e-7ca4-439b-a244-6a3336628877 nodeName:}" failed. No retries permitted until 2026-02-18 06:02:20.576534 +0000 UTC m=+837.745622242 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-webhook-certs") pod "openstack-operator-controller-manager-dccc9b448-nffdh" (UID: "3990868e-7ca4-439b-a244-6a3336628877") : secret "webhook-server-cert" not found Feb 18 06:02:04 crc kubenswrapper[4869]: E0218 06:02:04.576700 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fdd1d16-0b06-4553-b43a-943fb22f8961-cert podName:8fdd1d16-0b06-4553-b43a-943fb22f8961 nodeName:}" failed. No retries permitted until 2026-02-18 06:02:20.576650733 +0000 UTC m=+837.745738965 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8fdd1d16-0b06-4553-b43a-943fb22f8961-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv" (UID: "8fdd1d16-0b06-4553-b43a-943fb22f8961") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 06:02:04 crc kubenswrapper[4869]: E0218 06:02:04.576721 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6c026b-8825-42ec-8b66-9c2842957c10-cert podName:ea6c026b-8825-42ec-8b66-9c2842957c10 nodeName:}" failed. No retries permitted until 2026-02-18 06:02:20.576712525 +0000 UTC m=+837.745800757 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea6c026b-8825-42ec-8b66-9c2842957c10-cert") pod "infra-operator-controller-manager-79d975b745-fnrx8" (UID: "ea6c026b-8825-42ec-8b66-9c2842957c10") : secret "infra-operator-webhook-server-cert" not found Feb 18 06:02:06 crc kubenswrapper[4869]: I0218 06:02:06.841802 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 18 06:02:09 crc kubenswrapper[4869]: I0218 06:02:09.394279 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xbx2b"] Feb 18 06:02:09 crc kubenswrapper[4869]: I0218 06:02:09.397664 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xbx2b" Feb 18 06:02:09 crc kubenswrapper[4869]: I0218 06:02:09.408000 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xbx2b"] Feb 18 06:02:09 crc kubenswrapper[4869]: I0218 06:02:09.554605 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df649552-2364-4dc2-a040-ec6af54d9f20-utilities\") pod \"certified-operators-xbx2b\" (UID: \"df649552-2364-4dc2-a040-ec6af54d9f20\") " pod="openshift-marketplace/certified-operators-xbx2b" Feb 18 06:02:09 crc kubenswrapper[4869]: I0218 06:02:09.554945 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stpxb\" (UniqueName: \"kubernetes.io/projected/df649552-2364-4dc2-a040-ec6af54d9f20-kube-api-access-stpxb\") pod \"certified-operators-xbx2b\" (UID: \"df649552-2364-4dc2-a040-ec6af54d9f20\") " pod="openshift-marketplace/certified-operators-xbx2b" Feb 18 06:02:09 crc kubenswrapper[4869]: I0218 06:02:09.555158 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/df649552-2364-4dc2-a040-ec6af54d9f20-catalog-content\") pod \"certified-operators-xbx2b\" (UID: \"df649552-2364-4dc2-a040-ec6af54d9f20\") " pod="openshift-marketplace/certified-operators-xbx2b" Feb 18 06:02:09 crc kubenswrapper[4869]: I0218 06:02:09.656232 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stpxb\" (UniqueName: \"kubernetes.io/projected/df649552-2364-4dc2-a040-ec6af54d9f20-kube-api-access-stpxb\") pod \"certified-operators-xbx2b\" (UID: \"df649552-2364-4dc2-a040-ec6af54d9f20\") " pod="openshift-marketplace/certified-operators-xbx2b" Feb 18 06:02:09 crc kubenswrapper[4869]: I0218 06:02:09.656659 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df649552-2364-4dc2-a040-ec6af54d9f20-catalog-content\") pod \"certified-operators-xbx2b\" (UID: \"df649552-2364-4dc2-a040-ec6af54d9f20\") " pod="openshift-marketplace/certified-operators-xbx2b" Feb 18 06:02:09 crc kubenswrapper[4869]: I0218 06:02:09.656873 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df649552-2364-4dc2-a040-ec6af54d9f20-utilities\") pod \"certified-operators-xbx2b\" (UID: \"df649552-2364-4dc2-a040-ec6af54d9f20\") " pod="openshift-marketplace/certified-operators-xbx2b" Feb 18 06:02:09 crc kubenswrapper[4869]: I0218 06:02:09.657216 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df649552-2364-4dc2-a040-ec6af54d9f20-catalog-content\") pod \"certified-operators-xbx2b\" (UID: \"df649552-2364-4dc2-a040-ec6af54d9f20\") " pod="openshift-marketplace/certified-operators-xbx2b" Feb 18 06:02:09 crc kubenswrapper[4869]: I0218 06:02:09.657538 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/df649552-2364-4dc2-a040-ec6af54d9f20-utilities\") pod \"certified-operators-xbx2b\" (UID: \"df649552-2364-4dc2-a040-ec6af54d9f20\") " pod="openshift-marketplace/certified-operators-xbx2b" Feb 18 06:02:09 crc kubenswrapper[4869]: I0218 06:02:09.689592 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stpxb\" (UniqueName: \"kubernetes.io/projected/df649552-2364-4dc2-a040-ec6af54d9f20-kube-api-access-stpxb\") pod \"certified-operators-xbx2b\" (UID: \"df649552-2364-4dc2-a040-ec6af54d9f20\") " pod="openshift-marketplace/certified-operators-xbx2b" Feb 18 06:02:09 crc kubenswrapper[4869]: I0218 06:02:09.720283 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xbx2b" Feb 18 06:02:10 crc kubenswrapper[4869]: I0218 06:02:10.043699 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xbx2b"] Feb 18 06:02:10 crc kubenswrapper[4869]: I0218 06:02:10.197409 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xbx2b" event={"ID":"df649552-2364-4dc2-a040-ec6af54d9f20","Type":"ContainerStarted","Data":"f48779a6b02b59133cb813829b63b6e2ac46b1f38c4c26fe82ac91febab1099e"} Feb 18 06:02:11 crc kubenswrapper[4869]: I0218 06:02:11.206396 4869 generic.go:334] "Generic (PLEG): container finished" podID="df649552-2364-4dc2-a040-ec6af54d9f20" containerID="0bae3229c32df696a68ca5831373321dddc834ec45f67cc82b4c0700fec662fa" exitCode=0 Feb 18 06:02:11 crc kubenswrapper[4869]: I0218 06:02:11.206510 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xbx2b" event={"ID":"df649552-2364-4dc2-a040-ec6af54d9f20","Type":"ContainerDied","Data":"0bae3229c32df696a68ca5831373321dddc834ec45f67cc82b4c0700fec662fa"} Feb 18 06:02:20 crc kubenswrapper[4869]: I0218 06:02:20.034785 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-metrics-certs\") pod \"openstack-operator-controller-manager-dccc9b448-nffdh\" (UID: \"3990868e-7ca4-439b-a244-6a3336628877\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" Feb 18 06:02:20 crc kubenswrapper[4869]: I0218 06:02:20.039444 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-metrics-certs\") pod \"openstack-operator-controller-manager-dccc9b448-nffdh\" (UID: \"3990868e-7ca4-439b-a244-6a3336628877\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" Feb 18 06:02:20 crc kubenswrapper[4869]: I0218 06:02:20.611779 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea6c026b-8825-42ec-8b66-9c2842957c10-cert\") pod \"infra-operator-controller-manager-79d975b745-fnrx8\" (UID: \"ea6c026b-8825-42ec-8b66-9c2842957c10\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-fnrx8" Feb 18 06:02:20 crc kubenswrapper[4869]: I0218 06:02:20.612345 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-webhook-certs\") pod \"openstack-operator-controller-manager-dccc9b448-nffdh\" (UID: \"3990868e-7ca4-439b-a244-6a3336628877\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" Feb 18 06:02:20 crc kubenswrapper[4869]: I0218 06:02:20.612420 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fdd1d16-0b06-4553-b43a-943fb22f8961-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv\" (UID: 
\"8fdd1d16-0b06-4553-b43a-943fb22f8961\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv" Feb 18 06:02:20 crc kubenswrapper[4869]: I0218 06:02:20.618511 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea6c026b-8825-42ec-8b66-9c2842957c10-cert\") pod \"infra-operator-controller-manager-79d975b745-fnrx8\" (UID: \"ea6c026b-8825-42ec-8b66-9c2842957c10\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-fnrx8" Feb 18 06:02:20 crc kubenswrapper[4869]: I0218 06:02:20.618734 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3990868e-7ca4-439b-a244-6a3336628877-webhook-certs\") pod \"openstack-operator-controller-manager-dccc9b448-nffdh\" (UID: \"3990868e-7ca4-439b-a244-6a3336628877\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" Feb 18 06:02:20 crc kubenswrapper[4869]: I0218 06:02:20.619356 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fdd1d16-0b06-4553-b43a-943fb22f8961-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv\" (UID: \"8fdd1d16-0b06-4553-b43a-943fb22f8961\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv" Feb 18 06:02:20 crc kubenswrapper[4869]: I0218 06:02:20.769294 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-vn2vp" Feb 18 06:02:20 crc kubenswrapper[4869]: I0218 06:02:20.777337 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-fnrx8" Feb 18 06:02:20 crc kubenswrapper[4869]: I0218 06:02:20.831779 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-vd96h" Feb 18 06:02:20 crc kubenswrapper[4869]: I0218 06:02:20.840216 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv" Feb 18 06:02:20 crc kubenswrapper[4869]: I0218 06:02:20.865028 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-bxsm7" Feb 18 06:02:20 crc kubenswrapper[4869]: I0218 06:02:20.873834 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" Feb 18 06:02:21 crc kubenswrapper[4869]: I0218 06:02:21.714892 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-fnrx8"] Feb 18 06:02:21 crc kubenswrapper[4869]: I0218 06:02:21.723433 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh"] Feb 18 06:02:21 crc kubenswrapper[4869]: W0218 06:02:21.724518 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6c026b_8825_42ec_8b66_9c2842957c10.slice/crio-eae68977e92a90e425c2d9bdf144910daa4a07a5ac01e9ff35f12267b52a6c11 WatchSource:0}: Error finding container eae68977e92a90e425c2d9bdf144910daa4a07a5ac01e9ff35f12267b52a6c11: Status 404 returned error can't find the container with id eae68977e92a90e425c2d9bdf144910daa4a07a5ac01e9ff35f12267b52a6c11 Feb 18 06:02:21 crc kubenswrapper[4869]: I0218 06:02:21.739353 4869 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv"] Feb 18 06:02:21 crc kubenswrapper[4869]: W0218 06:02:21.749619 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fdd1d16_0b06_4553_b43a_943fb22f8961.slice/crio-5e54ac0a94ebd24b4d1cd2d8d6b7df1366bec23c3a061ee6f830582b9d3da739 WatchSource:0}: Error finding container 5e54ac0a94ebd24b4d1cd2d8d6b7df1366bec23c3a061ee6f830582b9d3da739: Status 404 returned error can't find the container with id 5e54ac0a94ebd24b4d1cd2d8d6b7df1366bec23c3a061ee6f830582b9d3da739 Feb 18 06:02:22 crc kubenswrapper[4869]: I0218 06:02:22.486545 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv" event={"ID":"8fdd1d16-0b06-4553-b43a-943fb22f8961","Type":"ContainerStarted","Data":"5e54ac0a94ebd24b4d1cd2d8d6b7df1366bec23c3a061ee6f830582b9d3da739"} Feb 18 06:02:22 crc kubenswrapper[4869]: I0218 06:02:22.488179 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" event={"ID":"3990868e-7ca4-439b-a244-6a3336628877","Type":"ContainerStarted","Data":"989f650bca4f55ada8f9c6ce7beaf5d26e4ec05ade40a84e70ce991f8e8e7e3f"} Feb 18 06:02:22 crc kubenswrapper[4869]: I0218 06:02:22.488261 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" event={"ID":"3990868e-7ca4-439b-a244-6a3336628877","Type":"ContainerStarted","Data":"032bf2d391413e334b42cab5b7773acae7a51de39f4fbc513fdff8e9b74f07e3"} Feb 18 06:02:22 crc kubenswrapper[4869]: I0218 06:02:22.488922 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" Feb 18 06:02:22 crc kubenswrapper[4869]: I0218 06:02:22.491192 
4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-fnrx8" event={"ID":"ea6c026b-8825-42ec-8b66-9c2842957c10","Type":"ContainerStarted","Data":"eae68977e92a90e425c2d9bdf144910daa4a07a5ac01e9ff35f12267b52a6c11"} Feb 18 06:02:22 crc kubenswrapper[4869]: I0218 06:02:22.513825 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" podStartSLOduration=63.513815101 podStartE2EDuration="1m3.513815101s" podCreationTimestamp="2026-02-18 06:01:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:02:22.509857404 +0000 UTC m=+839.678945636" watchObservedRunningTime="2026-02-18 06:02:22.513815101 +0000 UTC m=+839.682903333" Feb 18 06:02:25 crc kubenswrapper[4869]: I0218 06:02:25.753930 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-69666c74dd-pv6sz" podUID="d597c072-fd48-4245-8a9a-5a80aaa78993" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.47:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 18 06:02:25 crc kubenswrapper[4869]: I0218 06:02:25.802039 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-69666c74dd-pv6sz" podUID="d597c072-fd48-4245-8a9a-5a80aaa78993" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.47:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 18 06:02:25 crc kubenswrapper[4869]: E0218 06:02:25.992687 4869 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.287s" Feb 18 06:02:26 crc kubenswrapper[4869]: I0218 06:02:26.091655 4869 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-nffdh" Feb 18 06:02:27 crc kubenswrapper[4869]: I0218 06:02:27.071799 4869 generic.go:334] "Generic (PLEG): container finished" podID="df649552-2364-4dc2-a040-ec6af54d9f20" containerID="f258968df1bd0c455a7e71b098eb91987fa7124e26d7a4c6ce38bcc928e8ac55" exitCode=0 Feb 18 06:02:27 crc kubenswrapper[4869]: I0218 06:02:27.071907 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xbx2b" event={"ID":"df649552-2364-4dc2-a040-ec6af54d9f20","Type":"ContainerDied","Data":"f258968df1bd0c455a7e71b098eb91987fa7124e26d7a4c6ce38bcc928e8ac55"} Feb 18 06:02:28 crc kubenswrapper[4869]: I0218 06:02:28.097704 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xbx2b" event={"ID":"df649552-2364-4dc2-a040-ec6af54d9f20","Type":"ContainerStarted","Data":"7dc63e668883245831886a430744d4d22bedf0a6f62c2b214f286269208217a3"} Feb 18 06:02:28 crc kubenswrapper[4869]: I0218 06:02:28.120721 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xbx2b" podStartSLOduration=2.736163196 podStartE2EDuration="19.120701763s" podCreationTimestamp="2026-02-18 06:02:09 +0000 UTC" firstStartedPulling="2026-02-18 06:02:11.208984933 +0000 UTC m=+828.378073185" lastFinishedPulling="2026-02-18 06:02:27.59352352 +0000 UTC m=+844.762611752" observedRunningTime="2026-02-18 06:02:28.118020278 +0000 UTC m=+845.287108510" watchObservedRunningTime="2026-02-18 06:02:28.120701763 +0000 UTC m=+845.289789995" Feb 18 06:02:29 crc kubenswrapper[4869]: I0218 06:02:29.721582 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xbx2b" Feb 18 06:02:29 crc kubenswrapper[4869]: I0218 06:02:29.725928 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-xbx2b" Feb 18 06:02:29 crc kubenswrapper[4869]: I0218 06:02:29.776050 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xbx2b" Feb 18 06:02:39 crc kubenswrapper[4869]: I0218 06:02:39.771683 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xbx2b" Feb 18 06:02:39 crc kubenswrapper[4869]: I0218 06:02:39.820997 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xbx2b"] Feb 18 06:02:40 crc kubenswrapper[4869]: I0218 06:02:40.198427 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xbx2b" podUID="df649552-2364-4dc2-a040-ec6af54d9f20" containerName="registry-server" containerID="cri-o://7dc63e668883245831886a430744d4d22bedf0a6f62c2b214f286269208217a3" gracePeriod=2 Feb 18 06:02:40 crc kubenswrapper[4869]: E0218 06:02:40.929104 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6" Feb 18 06:02:40 crc kubenswrapper[4869]: E0218 06:02:40.929364 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sm9g7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7866795846-6nn8b_openstack-operators(c7070d2d-1fcc-4ae8-9380-d0f500c95d01): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:02:40 crc kubenswrapper[4869]: E0218 06:02:40.930973 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-7866795846-6nn8b" podUID="c7070d2d-1fcc-4ae8-9380-d0f500c95d01" Feb 18 06:02:41 crc kubenswrapper[4869]: I0218 06:02:41.210092 4869 generic.go:334] "Generic (PLEG): container finished" podID="df649552-2364-4dc2-a040-ec6af54d9f20" containerID="7dc63e668883245831886a430744d4d22bedf0a6f62c2b214f286269208217a3" exitCode=0 Feb 18 06:02:41 crc kubenswrapper[4869]: I0218 06:02:41.210167 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xbx2b" event={"ID":"df649552-2364-4dc2-a040-ec6af54d9f20","Type":"ContainerDied","Data":"7dc63e668883245831886a430744d4d22bedf0a6f62c2b214f286269208217a3"} Feb 18 06:02:41 
crc kubenswrapper[4869]: E0218 06:02:41.654780 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99" Feb 18 06:02:41 crc kubenswrapper[4869]: E0218 06:02:41.654991 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hsvhq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-7f45b4ff68-xkrff_openstack-operators(63e77cf1-d554-4e43-a6e0-93e671cc90fc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:02:41 crc kubenswrapper[4869]: E0218 06:02:41.656164 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xkrff" podUID="63e77cf1-d554-4e43-a6e0-93e671cc90fc" Feb 18 06:02:43 crc kubenswrapper[4869]: E0218 06:02:43.474720 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd" Feb 18 06:02:43 crc kubenswrapper[4869]: E0218 06:02:43.475284 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lxj5h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8497b45c89-47sw8_openstack-operators(e2de3218-8c57-43ab-b45e-e69a92456549): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:02:43 crc kubenswrapper[4869]: E0218 06:02:43.476429 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-47sw8" podUID="e2de3218-8c57-43ab-b45e-e69a92456549" Feb 18 06:02:44 crc kubenswrapper[4869]: E0218 06:02:44.381187 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:1ab3ec59cd8e30dd8423e91ad832403bdefbae3b8ac47e15578d5a677d7ba0df" Feb 18 06:02:44 crc kubenswrapper[4869]: E0218 06:02:44.381821 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:1ab3ec59cd8e30dd8423e91ad832403bdefbae3b8ac47e15578d5a677d7ba0df,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hjpsd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987464f4-gfxvp_openstack-operators(02aea0c3-b59c-41dd-9c48-514fd4bfa94c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:02:44 crc kubenswrapper[4869]: E0218 06:02:44.383810 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-77987464f4-gfxvp" podUID="02aea0c3-b59c-41dd-9c48-514fd4bfa94c" Feb 18 06:02:45 crc kubenswrapper[4869]: E0218 06:02:45.095963 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf" Feb 18 06:02:45 crc kubenswrapper[4869]: E0218 06:02:45.096158 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9scxr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-64ddbf8bb-drwmx_openstack-operators(cbb0292f-e15f-4a01-bd91-1c155779be07): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:02:45 crc kubenswrapper[4869]: E0218 06:02:45.097364 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-drwmx" podUID="cbb0292f-e15f-4a01-bd91-1c155779be07" Feb 18 06:02:45 crc kubenswrapper[4869]: E0218 06:02:45.627167 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:90ad8fd8c1889b6be77925016532218eb6149d2c1c8535a5f9f1775c776fa6cc" Feb 18 06:02:45 crc kubenswrapper[4869]: E0218 06:02:45.627791 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:90ad8fd8c1889b6be77925016532218eb6149d2c1c8535a5f9f1775c776fa6cc,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9tngq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-868647ff47-q5gnw_openstack-operators(c1a14efa-4a9b-49b6-a882-c0d080269850): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:02:45 crc kubenswrapper[4869]: E0218 06:02:45.629121 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-q5gnw" podUID="c1a14efa-4a9b-49b6-a882-c0d080269850" Feb 18 06:02:46 crc kubenswrapper[4869]: E0218 06:02:46.452919 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:c1e33e962043cd6e3d09ebd225cb72781451dba7af2d57522e5c6eedbdc91642" Feb 18 06:02:46 crc kubenswrapper[4869]: E0218 06:02:46.453112 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:c1e33e962043cd6e3d09ebd225cb72781451dba7af2d57522e5c6eedbdc91642,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-djd66,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-6d8bf5c495-8gxgm_openstack-operators(77f20e81-cc4d-44ab-9f77-40080cc392ec): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:02:46 crc kubenswrapper[4869]: E0218 06:02:46.454279 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-8gxgm" podUID="77f20e81-cc4d-44ab-9f77-40080cc392ec" Feb 18 06:02:46 crc kubenswrapper[4869]: E0218 06:02:46.926060 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759" Feb 18 06:02:46 crc kubenswrapper[4869]: E0218 06:02:46.926285 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m6mnf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-d44cf6b75-gbx94_openstack-operators(e90352c9-520d-40dc-b9f6-3919a8bd67fb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:02:46 crc kubenswrapper[4869]: E0218 06:02:46.927602 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-gbx94" podUID="e90352c9-520d-40dc-b9f6-3919a8bd67fb" Feb 18 06:02:47 crc kubenswrapper[4869]: E0218 06:02:47.363247 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04" Feb 18 06:02:47 crc kubenswrapper[4869]: E0218 06:02:47.363723 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xncdt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-86mkx_openstack-operators(f954d2f9-baf9-4d98-bee1-05598035e3a1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:02:47 crc kubenswrapper[4869]: E0218 06:02:47.364942 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-86mkx" podUID="f954d2f9-baf9-4d98-bee1-05598035e3a1" Feb 18 06:02:47 crc kubenswrapper[4869]: E0218 06:02:47.957774 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/infra-operator@sha256:aef5ea3dc1d4f5b63416ee1cc12d0360a64229bb3fb954be3dd85eec8f4ae62a" Feb 18 06:02:47 crc kubenswrapper[4869]: E0218 06:02:47.958201 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:aef5ea3dc1d4f5b63416ee1cc12d0360a64229bb3fb954be3dd85eec8f4ae62a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k2q68,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-79d975b745-fnrx8_openstack-operators(ea6c026b-8825-42ec-8b66-9c2842957c10): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:02:47 crc kubenswrapper[4869]: E0218 06:02:47.959354 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-79d975b745-fnrx8" podUID="ea6c026b-8825-42ec-8b66-9c2842957c10" Feb 18 06:02:48 crc kubenswrapper[4869]: E0218 06:02:48.395054 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:aef5ea3dc1d4f5b63416ee1cc12d0360a64229bb3fb954be3dd85eec8f4ae62a\\\"\"" pod="openstack-operators/infra-operator-controller-manager-79d975b745-fnrx8" podUID="ea6c026b-8825-42ec-8b66-9c2842957c10" Feb 18 06:02:48 crc kubenswrapper[4869]: E0218 06:02:48.608044 4869 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da" Feb 18 06:02:48 crc kubenswrapper[4869]: E0218 06:02:48.608238 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b9hss,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-5b9b8895d5-n5zdw_openstack-operators(4a638516-be5b-4a24-9d1a-cc5dbcaac3ed): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:02:48 crc kubenswrapper[4869]: E0218 06:02:48.609451 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-n5zdw" podUID="4a638516-be5b-4a24-9d1a-cc5dbcaac3ed" Feb 18 06:02:49 crc kubenswrapper[4869]: E0218 06:02:49.065317 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a" Feb 18 06:02:49 crc kubenswrapper[4869]: E0218 06:02:49.065500 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k4qfh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6994f66f48-sbq48_openstack-operators(835a35ac-1347-46f7-ae71-aa38e8aea7cf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:02:49 crc kubenswrapper[4869]: E0218 06:02:49.066612 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-sbq48" podUID="835a35ac-1347-46f7-ae71-aa38e8aea7cf" Feb 18 06:02:49 crc kubenswrapper[4869]: E0218 06:02:49.613335 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867" Feb 18 06:02:49 crc kubenswrapper[4869]: E0218 06:02:49.613535 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-67s8v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-554564d7fc-55txz_openstack-operators(da28bc19-c4b2-4d9a-8357-6ce9680567ce): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:02:49 crc kubenswrapper[4869]: E0218 06:02:49.615990 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-55txz" podUID="da28bc19-c4b2-4d9a-8357-6ce9680567ce" Feb 18 06:02:49 crc kubenswrapper[4869]: E0218 06:02:49.721363 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7dc63e668883245831886a430744d4d22bedf0a6f62c2b214f286269208217a3 is running failed: container process not found" containerID="7dc63e668883245831886a430744d4d22bedf0a6f62c2b214f286269208217a3" cmd=["grpc_health_probe","-addr=:50051"] Feb 18 06:02:49 crc kubenswrapper[4869]: E0218 06:02:49.721942 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: 
code = NotFound desc = container is not created or running: checking if PID of 7dc63e668883245831886a430744d4d22bedf0a6f62c2b214f286269208217a3 is running failed: container process not found" containerID="7dc63e668883245831886a430744d4d22bedf0a6f62c2b214f286269208217a3" cmd=["grpc_health_probe","-addr=:50051"] Feb 18 06:02:49 crc kubenswrapper[4869]: E0218 06:02:49.722518 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7dc63e668883245831886a430744d4d22bedf0a6f62c2b214f286269208217a3 is running failed: container process not found" containerID="7dc63e668883245831886a430744d4d22bedf0a6f62c2b214f286269208217a3" cmd=["grpc_health_probe","-addr=:50051"] Feb 18 06:02:49 crc kubenswrapper[4869]: E0218 06:02:49.722558 4869 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7dc63e668883245831886a430744d4d22bedf0a6f62c2b214f286269208217a3 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-xbx2b" podUID="df649552-2364-4dc2-a040-ec6af54d9f20" containerName="registry-server" Feb 18 06:02:50 crc kubenswrapper[4869]: E0218 06:02:50.241547 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:2b8ab3063af4aaeed0198197aae6f391c6647ac686c94c85668537f1d5933979" Feb 18 06:02:50 crc kubenswrapper[4869]: E0218 06:02:50.241725 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:2b8ab3063af4aaeed0198197aae6f391c6647ac686c94c85668537f1d5933979,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n8gcz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-5d946d989d-knzcc_openstack-operators(e5345d15-54e7-4c42-92d2-e3f4d63e9533): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:02:50 crc kubenswrapper[4869]: E0218 06:02:50.242909 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-knzcc" podUID="e5345d15-54e7-4c42-92d2-e3f4d63e9533" Feb 18 06:02:52 crc kubenswrapper[4869]: E0218 06:02:52.186210 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838" Feb 18 06:02:52 crc kubenswrapper[4869]: E0218 06:02:52.186657 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c9f64,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-vfs4v_openstack-operators(269fa527-4152-4014-b070-7e651d5f7b2f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:02:52 crc kubenswrapper[4869]: E0218 06:02:52.187801 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-vfs4v" podUID="269fa527-4152-4014-b070-7e651d5f7b2f" Feb 18 06:02:53 crc kubenswrapper[4869]: E0218 06:02:53.484410 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-6nn8b" podUID="c7070d2d-1fcc-4ae8-9380-d0f500c95d01" Feb 18 06:02:53 crc kubenswrapper[4869]: E0218 06:02:53.775290 4869 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Feb 18 06:02:53 crc kubenswrapper[4869]: E0218 06:02:53.775515 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b25p6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-wfxk8_openstack-operators(9d84cc45-ca42-454f-9323-35d717ea7cd4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:02:53 crc kubenswrapper[4869]: E0218 06:02:53.776845 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wfxk8" podUID="9d84cc45-ca42-454f-9323-35d717ea7cd4" Feb 18 06:02:54 crc kubenswrapper[4869]: E0218 06:02:54.367181 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1" Feb 18 06:02:54 crc kubenswrapper[4869]: E0218 06:02:54.367369 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6b7kc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b4d948c87-btm9t_openstack-operators(3519f676-e828-4ec9-8995-ecf778e36d4f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:02:54 crc kubenswrapper[4869]: E0218 06:02:54.368958 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-btm9t" podUID="3519f676-e828-4ec9-8995-ecf778e36d4f" Feb 18 06:02:54 crc kubenswrapper[4869]: I0218 06:02:54.461468 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xbx2b" Feb 18 06:02:54 crc kubenswrapper[4869]: E0218 06:02:54.473210 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-47sw8" podUID="e2de3218-8c57-43ab-b45e-e69a92456549" Feb 18 06:02:54 crc kubenswrapper[4869]: I0218 06:02:54.503569 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xbx2b" event={"ID":"df649552-2364-4dc2-a040-ec6af54d9f20","Type":"ContainerDied","Data":"f48779a6b02b59133cb813829b63b6e2ac46b1f38c4c26fe82ac91febab1099e"} Feb 18 06:02:54 crc kubenswrapper[4869]: I0218 06:02:54.503631 4869 scope.go:117] "RemoveContainer" containerID="7dc63e668883245831886a430744d4d22bedf0a6f62c2b214f286269208217a3" Feb 18 06:02:54 crc kubenswrapper[4869]: I0218 06:02:54.503856 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xbx2b" Feb 18 06:02:54 crc kubenswrapper[4869]: I0218 06:02:54.505658 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df649552-2364-4dc2-a040-ec6af54d9f20-utilities\") pod \"df649552-2364-4dc2-a040-ec6af54d9f20\" (UID: \"df649552-2364-4dc2-a040-ec6af54d9f20\") " Feb 18 06:02:54 crc kubenswrapper[4869]: I0218 06:02:54.505818 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stpxb\" (UniqueName: \"kubernetes.io/projected/df649552-2364-4dc2-a040-ec6af54d9f20-kube-api-access-stpxb\") pod \"df649552-2364-4dc2-a040-ec6af54d9f20\" (UID: \"df649552-2364-4dc2-a040-ec6af54d9f20\") " Feb 18 06:02:54 crc kubenswrapper[4869]: I0218 06:02:54.505880 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df649552-2364-4dc2-a040-ec6af54d9f20-catalog-content\") pod \"df649552-2364-4dc2-a040-ec6af54d9f20\" (UID: \"df649552-2364-4dc2-a040-ec6af54d9f20\") " Feb 18 06:02:54 crc kubenswrapper[4869]: I0218 06:02:54.510309 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df649552-2364-4dc2-a040-ec6af54d9f20-utilities" (OuterVolumeSpecName: "utilities") pod "df649552-2364-4dc2-a040-ec6af54d9f20" (UID: "df649552-2364-4dc2-a040-ec6af54d9f20"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:02:54 crc kubenswrapper[4869]: I0218 06:02:54.521917 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df649552-2364-4dc2-a040-ec6af54d9f20-kube-api-access-stpxb" (OuterVolumeSpecName: "kube-api-access-stpxb") pod "df649552-2364-4dc2-a040-ec6af54d9f20" (UID: "df649552-2364-4dc2-a040-ec6af54d9f20"). InnerVolumeSpecName "kube-api-access-stpxb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:02:54 crc kubenswrapper[4869]: I0218 06:02:54.532478 4869 scope.go:117] "RemoveContainer" containerID="f258968df1bd0c455a7e71b098eb91987fa7124e26d7a4c6ce38bcc928e8ac55" Feb 18 06:02:54 crc kubenswrapper[4869]: I0218 06:02:54.562644 4869 scope.go:117] "RemoveContainer" containerID="0bae3229c32df696a68ca5831373321dddc834ec45f67cc82b4c0700fec662fa" Feb 18 06:02:54 crc kubenswrapper[4869]: I0218 06:02:54.566988 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df649552-2364-4dc2-a040-ec6af54d9f20-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df649552-2364-4dc2-a040-ec6af54d9f20" (UID: "df649552-2364-4dc2-a040-ec6af54d9f20"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:02:54 crc kubenswrapper[4869]: I0218 06:02:54.609449 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df649552-2364-4dc2-a040-ec6af54d9f20-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:02:54 crc kubenswrapper[4869]: I0218 06:02:54.609493 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stpxb\" (UniqueName: \"kubernetes.io/projected/df649552-2364-4dc2-a040-ec6af54d9f20-kube-api-access-stpxb\") on node \"crc\" DevicePath \"\"" Feb 18 06:02:54 crc kubenswrapper[4869]: I0218 06:02:54.609503 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df649552-2364-4dc2-a040-ec6af54d9f20-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:02:54 crc kubenswrapper[4869]: I0218 06:02:54.833210 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xbx2b"] Feb 18 06:02:54 crc kubenswrapper[4869]: I0218 06:02:54.846096 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-xbx2b"] Feb 18 06:02:55 crc kubenswrapper[4869]: E0218 06:02:55.472158 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xkrff" podUID="63e77cf1-d554-4e43-a6e0-93e671cc90fc" Feb 18 06:02:55 crc kubenswrapper[4869]: I0218 06:02:55.478143 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df649552-2364-4dc2-a040-ec6af54d9f20" path="/var/lib/kubelet/pods/df649552-2364-4dc2-a040-ec6af54d9f20/volumes" Feb 18 06:02:55 crc kubenswrapper[4869]: I0218 06:02:55.509676 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv" event={"ID":"8fdd1d16-0b06-4553-b43a-943fb22f8961","Type":"ContainerStarted","Data":"a90f9fe1ae7bbfa94a0fd3e44458aef92a7b6a0be40946eb894402b8c79accd8"} Feb 18 06:02:55 crc kubenswrapper[4869]: I0218 06:02:55.509792 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv" Feb 18 06:02:55 crc kubenswrapper[4869]: I0218 06:02:55.511438 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-2ctsx" event={"ID":"78799685-a70e-4b5d-ae0f-fbd4ac1f48fd","Type":"ContainerStarted","Data":"e1d81ba5ac51541c114c38037bff23c91f5c9a32f58dc2bec64774cf5af55cc4"} Feb 18 06:02:55 crc kubenswrapper[4869]: I0218 06:02:55.511963 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-2ctsx" Feb 18 06:02:55 crc kubenswrapper[4869]: I0218 06:02:55.512843 4869 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-s9l2b" event={"ID":"65820ad0-cf24-499c-b418-8980edb8788a","Type":"ContainerStarted","Data":"aa028fe3f0810878524f13f3ebcecd7a95b7c3f2639c720d2cf0412d818b7bc3"} Feb 18 06:02:55 crc kubenswrapper[4869]: I0218 06:02:55.513025 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-s9l2b" Feb 18 06:02:55 crc kubenswrapper[4869]: I0218 06:02:55.514248 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-t6pg6" event={"ID":"fe9a7273-de20-4420-8335-dc291458c338","Type":"ContainerStarted","Data":"8c952a71b910a2ce3bc0153cb547a16e65d20a30c632323dd14638a7b0166139"} Feb 18 06:02:55 crc kubenswrapper[4869]: I0218 06:02:55.514385 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-t6pg6" Feb 18 06:02:55 crc kubenswrapper[4869]: I0218 06:02:55.516487 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-5t2ll" event={"ID":"3627c187-4d3b-49cb-9367-5758e676b0af","Type":"ContainerStarted","Data":"62f2d6839e594d5b0c2a985a56f1c3c4793c225ff404c1b579d10332a268e284"} Feb 18 06:02:55 crc kubenswrapper[4869]: I0218 06:02:55.516674 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-5t2ll" Feb 18 06:02:55 crc kubenswrapper[4869]: I0218 06:02:55.543346 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv" podStartSLOduration=63.95632012 podStartE2EDuration="1m36.543329598s" podCreationTimestamp="2026-02-18 06:01:19 +0000 UTC" firstStartedPulling="2026-02-18 
06:02:21.760013841 +0000 UTC m=+838.929102063" lastFinishedPulling="2026-02-18 06:02:54.347023309 +0000 UTC m=+871.516111541" observedRunningTime="2026-02-18 06:02:55.537010864 +0000 UTC m=+872.706099096" watchObservedRunningTime="2026-02-18 06:02:55.543329598 +0000 UTC m=+872.712417840" Feb 18 06:02:55 crc kubenswrapper[4869]: I0218 06:02:55.570243 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-t6pg6" podStartSLOduration=3.331074218 podStartE2EDuration="1m36.570228407s" podCreationTimestamp="2026-02-18 06:01:19 +0000 UTC" firstStartedPulling="2026-02-18 06:01:21.113012246 +0000 UTC m=+778.282100478" lastFinishedPulling="2026-02-18 06:02:54.352166435 +0000 UTC m=+871.521254667" observedRunningTime="2026-02-18 06:02:55.555844934 +0000 UTC m=+872.724933166" watchObservedRunningTime="2026-02-18 06:02:55.570228407 +0000 UTC m=+872.739316639" Feb 18 06:02:55 crc kubenswrapper[4869]: I0218 06:02:55.572800 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-5t2ll" podStartSLOduration=3.298119211 podStartE2EDuration="1m36.572793959s" podCreationTimestamp="2026-02-18 06:01:19 +0000 UTC" firstStartedPulling="2026-02-18 06:01:21.073282504 +0000 UTC m=+778.242370736" lastFinishedPulling="2026-02-18 06:02:54.347957252 +0000 UTC m=+871.517045484" observedRunningTime="2026-02-18 06:02:55.566647769 +0000 UTC m=+872.735736001" watchObservedRunningTime="2026-02-18 06:02:55.572793959 +0000 UTC m=+872.741882191" Feb 18 06:02:55 crc kubenswrapper[4869]: I0218 06:02:55.591283 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-s9l2b" podStartSLOduration=2.867808182 podStartE2EDuration="1m36.591259921s" podCreationTimestamp="2026-02-18 06:01:19 +0000 UTC" firstStartedPulling="2026-02-18 06:01:20.623629322 +0000 
UTC m=+777.792717544" lastFinishedPulling="2026-02-18 06:02:54.347081051 +0000 UTC m=+871.516169283" observedRunningTime="2026-02-18 06:02:55.578833987 +0000 UTC m=+872.747922219" watchObservedRunningTime="2026-02-18 06:02:55.591259921 +0000 UTC m=+872.760348163"
Feb 18 06:02:55 crc kubenswrapper[4869]: I0218 06:02:55.602003 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-2ctsx" podStartSLOduration=3.312859592 podStartE2EDuration="1m36.601989973s" podCreationTimestamp="2026-02-18 06:01:19 +0000 UTC" firstStartedPulling="2026-02-18 06:01:21.057926779 +0000 UTC m=+778.227015011" lastFinishedPulling="2026-02-18 06:02:54.34705716 +0000 UTC m=+871.516145392" observedRunningTime="2026-02-18 06:02:55.600117048 +0000 UTC m=+872.769205280" watchObservedRunningTime="2026-02-18 06:02:55.601989973 +0000 UTC m=+872.771078205"
Feb 18 06:02:56 crc kubenswrapper[4869]: E0218 06:02:56.474940 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:1ab3ec59cd8e30dd8423e91ad832403bdefbae3b8ac47e15578d5a677d7ba0df\\\"\"" pod="openstack-operators/glance-operator-controller-manager-77987464f4-gfxvp" podUID="02aea0c3-b59c-41dd-9c48-514fd4bfa94c"
Feb 18 06:02:58 crc kubenswrapper[4869]: E0218 06:02:58.470411 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-gbx94" podUID="e90352c9-520d-40dc-b9f6-3919a8bd67fb"
Feb 18 06:02:59 crc kubenswrapper[4869]: E0218 06:02:59.471779 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:c1e33e962043cd6e3d09ebd225cb72781451dba7af2d57522e5c6eedbdc91642\\\"\"" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-8gxgm" podUID="77f20e81-cc4d-44ab-9f77-40080cc392ec"
Feb 18 06:02:59 crc kubenswrapper[4869]: E0218 06:02:59.471834 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-drwmx" podUID="cbb0292f-e15f-4a01-bd91-1c155779be07"
Feb 18 06:02:59 crc kubenswrapper[4869]: I0218 06:02:59.543498 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-s9l2b"
Feb 18 06:02:59 crc kubenswrapper[4869]: I0218 06:02:59.731126 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-5t2ll"
Feb 18 06:02:59 crc kubenswrapper[4869]: I0218 06:02:59.909933 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-2ctsx"
Feb 18 06:03:00 crc kubenswrapper[4869]: I0218 06:03:00.210085 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-t6pg6"
Feb 18 06:03:00 crc kubenswrapper[4869]: E0218 06:03:00.471089 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:90ad8fd8c1889b6be77925016532218eb6149d2c1c8535a5f9f1775c776fa6cc\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-q5gnw" podUID="c1a14efa-4a9b-49b6-a882-c0d080269850"
Feb 18 06:03:00 crc kubenswrapper[4869]: I0218 06:03:00.848067 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv"
Feb 18 06:03:01 crc kubenswrapper[4869]: E0218 06:03:01.472305 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-86mkx" podUID="f954d2f9-baf9-4d98-bee1-05598035e3a1"
Feb 18 06:03:01 crc kubenswrapper[4869]: E0218 06:03:01.472385 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-n5zdw" podUID="4a638516-be5b-4a24-9d1a-cc5dbcaac3ed"
Feb 18 06:03:01 crc kubenswrapper[4869]: I0218 06:03:01.472513 4869 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 18 06:03:02 crc kubenswrapper[4869]: E0218 06:03:02.471317 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-sbq48" podUID="835a35ac-1347-46f7-ae71-aa38e8aea7cf"
Feb 18 06:03:02 crc kubenswrapper[4869]: E0218 06:03:02.471506 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-55txz" podUID="da28bc19-c4b2-4d9a-8357-6ce9680567ce"
Feb 18 06:03:02 crc kubenswrapper[4869]: I0218 06:03:02.559012 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-fnrx8" event={"ID":"ea6c026b-8825-42ec-8b66-9c2842957c10","Type":"ContainerStarted","Data":"d2c4b6d6b9e24766c4385e51132ea3c8f84f07c10446f9720fb65d1bdb198f1f"}
Feb 18 06:03:02 crc kubenswrapper[4869]: I0218 06:03:02.559871 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-fnrx8"
Feb 18 06:03:02 crc kubenswrapper[4869]: I0218 06:03:02.575867 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-fnrx8" podStartSLOduration=63.432573827 podStartE2EDuration="1m43.575852457s" podCreationTimestamp="2026-02-18 06:01:19 +0000 UTC" firstStartedPulling="2026-02-18 06:02:21.736591349 +0000 UTC m=+838.905679581" lastFinishedPulling="2026-02-18 06:03:01.879869959 +0000 UTC m=+879.048958211" observedRunningTime="2026-02-18 06:03:02.572172797 +0000 UTC m=+879.741261029" watchObservedRunningTime="2026-02-18 06:03:02.575852457 +0000 UTC m=+879.744940679"
Feb 18 06:03:04 crc kubenswrapper[4869]: E0218 06:03:04.472203 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wfxk8" podUID="9d84cc45-ca42-454f-9323-35d717ea7cd4"
Feb 18 06:03:05 crc kubenswrapper[4869]: E0218 06:03:05.471246 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:2b8ab3063af4aaeed0198197aae6f391c6647ac686c94c85668537f1d5933979\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-knzcc" podUID="e5345d15-54e7-4c42-92d2-e3f4d63e9533"
Feb 18 06:03:05 crc kubenswrapper[4869]: E0218 06:03:05.471584 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-vfs4v" podUID="269fa527-4152-4014-b070-7e651d5f7b2f"
Feb 18 06:03:06 crc kubenswrapper[4869]: I0218 06:03:06.586150 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-47sw8" event={"ID":"e2de3218-8c57-43ab-b45e-e69a92456549","Type":"ContainerStarted","Data":"480bbad2c50d7b24cc6c2f5ea7ffe98f54769d3a5f90fcf3ebf4a46191b2f339"}
Feb 18 06:03:06 crc kubenswrapper[4869]: I0218 06:03:06.586726 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-47sw8"
Feb 18 06:03:06 crc kubenswrapper[4869]: I0218 06:03:06.607159 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-47sw8" podStartSLOduration=2.779676867 podStartE2EDuration="1m47.607143047s" podCreationTimestamp="2026-02-18 06:01:19 +0000 UTC" firstStartedPulling="2026-02-18 06:01:21.085360609 +0000 UTC m=+778.254448841" lastFinishedPulling="2026-02-18 06:03:05.912826769 +0000 UTC m=+883.081915021" observedRunningTime="2026-02-18 06:03:06.602225637 +0000 UTC m=+883.771313869" watchObservedRunningTime="2026-02-18 06:03:06.607143047 +0000 UTC m=+883.776231269"
Feb 18 06:03:07 crc kubenswrapper[4869]: E0218 06:03:07.472483 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-btm9t" podUID="3519f676-e828-4ec9-8995-ecf778e36d4f"
Feb 18 06:03:10 crc kubenswrapper[4869]: I0218 06:03:10.133553 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 06:03:10 crc kubenswrapper[4869]: I0218 06:03:10.134261 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 06:03:10 crc kubenswrapper[4869]: I0218 06:03:10.783700 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-fnrx8"
Feb 18 06:03:11 crc kubenswrapper[4869]: I0218 06:03:11.729937 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-6nn8b" event={"ID":"c7070d2d-1fcc-4ae8-9380-d0f500c95d01","Type":"ContainerStarted","Data":"8623e2dbcd081dca2313690010c2cb59978da151b14cca6f45956b32f5ab3459"}
Feb 18 06:03:15 crc kubenswrapper[4869]: I0218 06:03:15.755556 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-6nn8b"
Feb 18 06:03:15 crc kubenswrapper[4869]: I0218 06:03:15.771083 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-6nn8b" podStartSLOduration=9.998954485 podStartE2EDuration="1m56.771069033s" podCreationTimestamp="2026-02-18 06:01:19 +0000 UTC" firstStartedPulling="2026-02-18 06:01:21.118086911 +0000 UTC m=+778.287175143" lastFinishedPulling="2026-02-18 06:03:07.890201459 +0000 UTC m=+885.059289691" observedRunningTime="2026-02-18 06:03:15.770168242 +0000 UTC m=+892.939256474" watchObservedRunningTime="2026-02-18 06:03:15.771069033 +0000 UTC m=+892.940157265"
Feb 18 06:03:16 crc kubenswrapper[4869]: I0218 06:03:16.766334 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-6nn8b"
Feb 18 06:03:17 crc kubenswrapper[4869]: I0218 06:03:17.781294 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-n5zdw" event={"ID":"4a638516-be5b-4a24-9d1a-cc5dbcaac3ed","Type":"ContainerStarted","Data":"4c43457ea1df45dd620d85781599a073b99ca4ace61ea3d362866b79299888d1"}
Feb 18 06:03:17 crc kubenswrapper[4869]: I0218 06:03:17.781517 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-n5zdw"
Feb 18 06:03:17 crc kubenswrapper[4869]: I0218 06:03:17.792465 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-8gxgm" event={"ID":"77f20e81-cc4d-44ab-9f77-40080cc392ec","Type":"ContainerStarted","Data":"c3b332675142881e0f4358bb35924a7a6487a58fd22be8eded9202d4e0053292"}
Feb 18 06:03:17 crc kubenswrapper[4869]: I0218 06:03:17.793185 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-8gxgm"
Feb 18 06:03:17 crc kubenswrapper[4869]: I0218 06:03:17.806993 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-drwmx" event={"ID":"cbb0292f-e15f-4a01-bd91-1c155779be07","Type":"ContainerStarted","Data":"f863ee1ee67b77f65f4f5c4fe00c0cb5b0a54e4132017dd1ee360552c396fe4d"}
Feb 18 06:03:17 crc kubenswrapper[4869]: I0218 06:03:17.807645 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-drwmx"
Feb 18 06:03:17 crc kubenswrapper[4869]: I0218 06:03:17.814319 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-n5zdw" podStartSLOduration=1.891011485 podStartE2EDuration="1m58.814297095s" podCreationTimestamp="2026-02-18 06:01:19 +0000 UTC" firstStartedPulling="2026-02-18 06:01:20.653975585 +0000 UTC m=+777.823063817" lastFinishedPulling="2026-02-18 06:03:17.577261195 +0000 UTC m=+894.746349427" observedRunningTime="2026-02-18 06:03:17.808209535 +0000 UTC m=+894.977297767" watchObservedRunningTime="2026-02-18 06:03:17.814297095 +0000 UTC m=+894.983385327"
Feb 18 06:03:17 crc kubenswrapper[4869]: I0218 06:03:17.831991 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-sbq48" event={"ID":"835a35ac-1347-46f7-ae71-aa38e8aea7cf","Type":"ContainerStarted","Data":"9ec20fba7189e9932141a7d923f7f5e2122a10eb266d7d53289e89b7162d4e4f"}
Feb 18 06:03:17 crc kubenswrapper[4869]: I0218 06:03:17.832648 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-sbq48"
Feb 18 06:03:17 crc kubenswrapper[4869]: I0218 06:03:17.849038 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wfxk8" event={"ID":"9d84cc45-ca42-454f-9323-35d717ea7cd4","Type":"ContainerStarted","Data":"c73cf48ff8a45fa8d35b2b65230784c9785e53e2c3a98e66655b3b9b113efcd5"}
Feb 18 06:03:17 crc kubenswrapper[4869]: I0218 06:03:17.866033 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-8gxgm" podStartSLOduration=2.232637713 podStartE2EDuration="1m58.86600728s" podCreationTimestamp="2026-02-18 06:01:19 +0000 UTC" firstStartedPulling="2026-02-18 06:01:20.62026155 +0000 UTC m=+777.789349782" lastFinishedPulling="2026-02-18 06:03:17.253631117 +0000 UTC m=+894.422719349" observedRunningTime="2026-02-18 06:03:17.860309251 +0000 UTC m=+895.029397493" watchObservedRunningTime="2026-02-18 06:03:17.86600728 +0000 UTC m=+895.035095502"
Feb 18 06:03:17 crc kubenswrapper[4869]: I0218 06:03:17.872648 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-gfxvp" event={"ID":"02aea0c3-b59c-41dd-9c48-514fd4bfa94c","Type":"ContainerStarted","Data":"7158b7893e4e46b5cf9fa62bf556cdec2134e1e22f4b19789404f1933c0d9714"}
Feb 18 06:03:17 crc kubenswrapper[4869]: I0218 06:03:17.874263 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987464f4-gfxvp"
Feb 18 06:03:17 crc kubenswrapper[4869]: I0218 06:03:17.909859 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-drwmx" podStartSLOduration=2.6215413180000002 podStartE2EDuration="1m58.909841872s" podCreationTimestamp="2026-02-18 06:01:19 +0000 UTC" firstStartedPulling="2026-02-18 06:01:20.966194784 +0000 UTC m=+778.135283016" lastFinishedPulling="2026-02-18 06:03:17.254495338 +0000 UTC m=+894.423583570" observedRunningTime="2026-02-18 06:03:17.908368196 +0000 UTC m=+895.077456428" watchObservedRunningTime="2026-02-18 06:03:17.909841872 +0000 UTC m=+895.078930104"
Feb 18 06:03:17 crc kubenswrapper[4869]: I0218 06:03:17.910408 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-gbx94" event={"ID":"e90352c9-520d-40dc-b9f6-3919a8bd67fb","Type":"ContainerStarted","Data":"1f32dd5af4de4d1de2c59ff39ddafd363248f36b11b3d48fa848d981abc2af92"}
Feb 18 06:03:17 crc kubenswrapper[4869]: I0218 06:03:17.911369 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-gbx94"
Feb 18 06:03:17 crc kubenswrapper[4869]: I0218 06:03:17.927787 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xkrff" event={"ID":"63e77cf1-d554-4e43-a6e0-93e671cc90fc","Type":"ContainerStarted","Data":"a130fc05c1c9cf643e6d5c52f10e355337bfa254e92c5f51d32e3258c2474bce"}
Feb 18 06:03:17 crc kubenswrapper[4869]: I0218 06:03:17.928128 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xkrff"
Feb 18 06:03:17 crc kubenswrapper[4869]: I0218 06:03:17.973993 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-sbq48" podStartSLOduration=2.723484573 podStartE2EDuration="1m58.973968651s" podCreationTimestamp="2026-02-18 06:01:19 +0000 UTC" firstStartedPulling="2026-02-18 06:01:21.00321334 +0000 UTC m=+778.172301562" lastFinishedPulling="2026-02-18 06:03:17.253697408 +0000 UTC m=+894.422785640" observedRunningTime="2026-02-18 06:03:17.965272288 +0000 UTC m=+895.134360540" watchObservedRunningTime="2026-02-18 06:03:17.973968651 +0000 UTC m=+895.143056883"
Feb 18 06:03:17 crc kubenswrapper[4869]: I0218 06:03:17.975458 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wfxk8" podStartSLOduration=2.615138431 podStartE2EDuration="1m58.975446667s" podCreationTimestamp="2026-02-18 06:01:19 +0000 UTC" firstStartedPulling="2026-02-18 06:01:21.218359153 +0000 UTC m=+778.387447385" lastFinishedPulling="2026-02-18 06:03:17.578667389 +0000 UTC m=+894.747755621" observedRunningTime="2026-02-18 06:03:17.936068244 +0000 UTC m=+895.105156466" watchObservedRunningTime="2026-02-18 06:03:17.975446667 +0000 UTC m=+895.144534899"
Feb 18 06:03:17 crc kubenswrapper[4869]: I0218 06:03:17.997413 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-gfxvp" podStartSLOduration=2.354602577 podStartE2EDuration="1m58.997394124s" podCreationTimestamp="2026-02-18 06:01:19 +0000 UTC" firstStartedPulling="2026-02-18 06:01:20.610934102 +0000 UTC m=+777.780022334" lastFinishedPulling="2026-02-18 06:03:17.253725659 +0000 UTC m=+894.422813881" observedRunningTime="2026-02-18 06:03:17.990983767 +0000 UTC m=+895.160071999" watchObservedRunningTime="2026-02-18 06:03:17.997394124 +0000 UTC m=+895.166482356"
Feb 18 06:03:18 crc kubenswrapper[4869]: I0218 06:03:18.026663 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xkrff" podStartSLOduration=2.8501692 podStartE2EDuration="1m59.026643989s" podCreationTimestamp="2026-02-18 06:01:19 +0000 UTC" firstStartedPulling="2026-02-18 06:01:21.072471634 +0000 UTC m=+778.241559866" lastFinishedPulling="2026-02-18 06:03:17.248946423 +0000 UTC m=+894.418034655" observedRunningTime="2026-02-18 06:03:18.024667081 +0000 UTC m=+895.193755303" watchObservedRunningTime="2026-02-18 06:03:18.026643989 +0000 UTC m=+895.195732221"
Feb 18 06:03:18 crc kubenswrapper[4869]: I0218 06:03:18.055009 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-gbx94" podStartSLOduration=2.887280601 podStartE2EDuration="1m59.054989304s" podCreationTimestamp="2026-02-18 06:01:19 +0000 UTC" firstStartedPulling="2026-02-18 06:01:21.087463301 +0000 UTC m=+778.256551533" lastFinishedPulling="2026-02-18 06:03:17.255172014 +0000 UTC m=+894.424260236" observedRunningTime="2026-02-18 06:03:18.048159676 +0000 UTC m=+895.217247918" watchObservedRunningTime="2026-02-18 06:03:18.054989304 +0000 UTC m=+895.224077536"
Feb 18 06:03:18 crc kubenswrapper[4869]: I0218 06:03:18.935048 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-86mkx" event={"ID":"f954d2f9-baf9-4d98-bee1-05598035e3a1","Type":"ContainerStarted","Data":"71c766c6db7e90246bc8bb631ed7cf23c32b3ad1c2fbf2eef5d3a3ce45e6e1a0"}
Feb 18 06:03:18 crc kubenswrapper[4869]: I0218 06:03:18.935218 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-86mkx"
Feb 18 06:03:18 crc kubenswrapper[4869]: I0218 06:03:18.936904 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-55txz" event={"ID":"da28bc19-c4b2-4d9a-8357-6ce9680567ce","Type":"ContainerStarted","Data":"c156c47172e69ad859d5d01beefbb4cba65f5b70da6a26e473888f5ca8bab930"}
Feb 18 06:03:18 crc kubenswrapper[4869]: I0218 06:03:18.937046 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-55txz"
Feb 18 06:03:18 crc kubenswrapper[4869]: I0218 06:03:18.939016 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-q5gnw" event={"ID":"c1a14efa-4a9b-49b6-a882-c0d080269850","Type":"ContainerStarted","Data":"89e6ce6b9fc1c83098b795c232ed73b65d21d6ab3ccd7e9ba475f4a7043ff997"}
Feb 18 06:03:18 crc kubenswrapper[4869]: I0218 06:03:18.957324 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-86mkx" podStartSLOduration=3.385769815 podStartE2EDuration="1m59.957304919s" podCreationTimestamp="2026-02-18 06:01:19 +0000 UTC" firstStartedPulling="2026-02-18 06:01:21.117814963 +0000 UTC m=+778.286903195" lastFinishedPulling="2026-02-18 06:03:17.689350067 +0000 UTC m=+894.858438299" observedRunningTime="2026-02-18 06:03:18.950041362 +0000 UTC m=+896.119129604" watchObservedRunningTime="2026-02-18 06:03:18.957304919 +0000 UTC m=+896.126393151"
Feb 18 06:03:18 crc kubenswrapper[4869]: I0218 06:03:18.971272 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-55txz" podStartSLOduration=3.224669594 podStartE2EDuration="1m59.971254651s" podCreationTimestamp="2026-02-18 06:01:19 +0000 UTC" firstStartedPulling="2026-02-18 06:01:21.087637095 +0000 UTC m=+778.256725317" lastFinishedPulling="2026-02-18 06:03:17.834222142 +0000 UTC m=+895.003310374" observedRunningTime="2026-02-18 06:03:18.966514345 +0000 UTC m=+896.135602577" watchObservedRunningTime="2026-02-18 06:03:18.971254651 +0000 UTC m=+896.140342883"
Feb 18 06:03:18 crc kubenswrapper[4869]: I0218 06:03:18.990458 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-q5gnw" podStartSLOduration=2.597374276 podStartE2EDuration="1m59.99043764s" podCreationTimestamp="2026-02-18 06:01:19 +0000 UTC" firstStartedPulling="2026-02-18 06:01:20.298485947 +0000 UTC m=+777.467574169" lastFinishedPulling="2026-02-18 06:03:17.691549301 +0000 UTC m=+894.860637533" observedRunningTime="2026-02-18 06:03:18.986229617 +0000 UTC m=+896.155317849" watchObservedRunningTime="2026-02-18 06:03:18.99043764 +0000 UTC m=+896.159525872"
Feb 18 06:03:19 crc kubenswrapper[4869]: I0218 06:03:19.459361 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-q5gnw"
Feb 18 06:03:19 crc kubenswrapper[4869]: I0218 06:03:19.946131 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-knzcc" event={"ID":"e5345d15-54e7-4c42-92d2-e3f4d63e9533","Type":"ContainerStarted","Data":"465bc0b1ac8523589f84c99735309c4c3ded8a9456a62799af3e1083ed3e6b03"}
Feb 18 06:03:19 crc kubenswrapper[4869]: I0218 06:03:19.946839 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-knzcc"
Feb 18 06:03:19 crc kubenswrapper[4869]: I0218 06:03:19.965244 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-knzcc" podStartSLOduration=2.723837751 podStartE2EDuration="2m0.96522736s" podCreationTimestamp="2026-02-18 06:01:19 +0000 UTC" firstStartedPulling="2026-02-18 06:01:20.623407757 +0000 UTC m=+777.792495989" lastFinishedPulling="2026-02-18 06:03:18.864797366 +0000 UTC m=+896.033885598" observedRunningTime="2026-02-18 06:03:19.9644163 +0000 UTC m=+897.133504522" watchObservedRunningTime="2026-02-18 06:03:19.96522736 +0000 UTC m=+897.134315592"
Feb 18 06:03:19 crc kubenswrapper[4869]: I0218 06:03:19.979388 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-47sw8"
Feb 18 06:03:20 crc kubenswrapper[4869]: I0218 06:03:20.954825 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-vfs4v" event={"ID":"269fa527-4152-4014-b070-7e651d5f7b2f","Type":"ContainerStarted","Data":"cf143adde2f071d4d8430142cbb0a714f45fb2ee919db6cb19be03df74480d2f"}
Feb 18 06:03:20 crc kubenswrapper[4869]: I0218 06:03:20.955393 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-vfs4v"
Feb 18 06:03:20 crc kubenswrapper[4869]: I0218 06:03:20.980332 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-vfs4v" podStartSLOduration=3.078497777 podStartE2EDuration="2m1.980304665s" podCreationTimestamp="2026-02-18 06:01:19 +0000 UTC" firstStartedPulling="2026-02-18 06:01:20.995714056 +0000 UTC m=+778.164802288" lastFinishedPulling="2026-02-18 06:03:19.897520934 +0000 UTC m=+897.066609176" observedRunningTime="2026-02-18 06:03:20.974586895 +0000 UTC m=+898.143675157" watchObservedRunningTime="2026-02-18 06:03:20.980304665 +0000 UTC m=+898.149392907"
Feb 18 06:03:24 crc kubenswrapper[4869]: I0218 06:03:24.007132 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-btm9t" event={"ID":"3519f676-e828-4ec9-8995-ecf778e36d4f","Type":"ContainerStarted","Data":"097873ecc1a1535c4bf2491e1627cac7748c56677ecbe7e692c642c90c780288"}
Feb 18 06:03:24 crc kubenswrapper[4869]: I0218 06:03:24.007916 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-btm9t"
Feb 18 06:03:24 crc kubenswrapper[4869]: I0218 06:03:24.027156 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-btm9t" podStartSLOduration=3.232371912 podStartE2EDuration="2m5.02713374s" podCreationTimestamp="2026-02-18 06:01:19 +0000 UTC" firstStartedPulling="2026-02-18 06:01:21.086782074 +0000 UTC m=+778.255870306" lastFinishedPulling="2026-02-18 06:03:22.881543892 +0000 UTC m=+900.050632134" observedRunningTime="2026-02-18 06:03:24.024295751 +0000 UTC m=+901.193384003" watchObservedRunningTime="2026-02-18 06:03:24.02713374 +0000 UTC m=+901.196221982"
Feb 18 06:03:29 crc kubenswrapper[4869]: I0218 06:03:29.463612 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-q5gnw"
Feb 18 06:03:29 crc kubenswrapper[4869]: I0218 06:03:29.486316 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-knzcc"
Feb 18 06:03:29 crc kubenswrapper[4869]: I0218 06:03:29.491816 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-8gxgm"
Feb 18 06:03:29 crc kubenswrapper[4869]: I0218 06:03:29.515655 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-gfxvp"
Feb 18 06:03:29 crc kubenswrapper[4869]: I0218 06:03:29.586218 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-n5zdw"
Feb 18 06:03:29 crc kubenswrapper[4869]: I0218 06:03:29.612837 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-55txz"
Feb 18 06:03:29 crc kubenswrapper[4869]: I0218 06:03:29.702245 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-btm9t"
Feb 18 06:03:29 crc kubenswrapper[4869]: I0218 06:03:29.798789 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-sbq48"
Feb 18 06:03:29 crc kubenswrapper[4869]: I0218 06:03:29.858045 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-drwmx"
Feb 18 06:03:29 crc kubenswrapper[4869]: I0218 06:03:29.886475 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-vfs4v"
Feb 18 06:03:29 crc kubenswrapper[4869]: I0218 06:03:29.966338 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-gbx94"
Feb 18 06:03:30 crc kubenswrapper[4869]: I0218 06:03:30.001716 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-86mkx"
Feb 18 06:03:30 crc kubenswrapper[4869]: I0218 06:03:30.019783 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xkrff"
Feb 18 06:03:40 crc kubenswrapper[4869]: I0218 06:03:40.133780 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 06:03:40 crc kubenswrapper[4869]: I0218 06:03:40.134469 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.156637 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ssqcs"]
Feb 18 06:03:49 crc kubenswrapper[4869]: E0218 06:03:49.158338 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df649552-2364-4dc2-a040-ec6af54d9f20" containerName="extract-utilities"
Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.158355 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="df649552-2364-4dc2-a040-ec6af54d9f20" containerName="extract-utilities"
Feb 18 06:03:49 crc kubenswrapper[4869]: E0218 06:03:49.158366 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df649552-2364-4dc2-a040-ec6af54d9f20" containerName="extract-content"
Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.158375 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="df649552-2364-4dc2-a040-ec6af54d9f20" containerName="extract-content"
Feb 18 06:03:49 crc kubenswrapper[4869]: E0218 06:03:49.158392 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df649552-2364-4dc2-a040-ec6af54d9f20" containerName="registry-server"
Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.158399 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="df649552-2364-4dc2-a040-ec6af54d9f20" containerName="registry-server"
Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.158566 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="df649552-2364-4dc2-a040-ec6af54d9f20" containerName="registry-server"
Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.159413 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ssqcs"
Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.161959 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.162172 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.162295 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.165018 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-czcwg"
Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.170721 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ssqcs"]
Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.200579 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5147618e-26e5-44b5-a963-be10c5d6513b-config\") pod \"dnsmasq-dns-675f4bcbfc-ssqcs\" (UID: \"5147618e-26e5-44b5-a963-be10c5d6513b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ssqcs"
Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.201048 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqz7f\" (UniqueName: \"kubernetes.io/projected/5147618e-26e5-44b5-a963-be10c5d6513b-kube-api-access-nqz7f\") pod \"dnsmasq-dns-675f4bcbfc-ssqcs\" (UID: \"5147618e-26e5-44b5-a963-be10c5d6513b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ssqcs"
Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.257108 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-l7tx6"]
Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.258117 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-l7tx6"
Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.259978 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.281128 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-l7tx6"]
Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.302257 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5147618e-26e5-44b5-a963-be10c5d6513b-config\") pod \"dnsmasq-dns-675f4bcbfc-ssqcs\" (UID: \"5147618e-26e5-44b5-a963-be10c5d6513b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ssqcs"
Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.302319 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e774cc7-59dc-4b84-b0b7-4d2171a154c1-config\") pod \"dnsmasq-dns-78dd6ddcc-l7tx6\" (UID: \"6e774cc7-59dc-4b84-b0b7-4d2171a154c1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-l7tx6"
Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.302342 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e774cc7-59dc-4b84-b0b7-4d2171a154c1-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-l7tx6\" (UID: \"6e774cc7-59dc-4b84-b0b7-4d2171a154c1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-l7tx6"
Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.302392 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqz7f\" (UniqueName: \"kubernetes.io/projected/5147618e-26e5-44b5-a963-be10c5d6513b-kube-api-access-nqz7f\") pod \"dnsmasq-dns-675f4bcbfc-ssqcs\" (UID: \"5147618e-26e5-44b5-a963-be10c5d6513b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ssqcs"
Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.302413 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9pwv\" (UniqueName: \"kubernetes.io/projected/6e774cc7-59dc-4b84-b0b7-4d2171a154c1-kube-api-access-x9pwv\") pod \"dnsmasq-dns-78dd6ddcc-l7tx6\" (UID: \"6e774cc7-59dc-4b84-b0b7-4d2171a154c1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-l7tx6"
Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.303384 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5147618e-26e5-44b5-a963-be10c5d6513b-config\") pod \"dnsmasq-dns-675f4bcbfc-ssqcs\" (UID: \"5147618e-26e5-44b5-a963-be10c5d6513b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ssqcs"
Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.322905 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqz7f\" (UniqueName: \"kubernetes.io/projected/5147618e-26e5-44b5-a963-be10c5d6513b-kube-api-access-nqz7f\") pod \"dnsmasq-dns-675f4bcbfc-ssqcs\" (UID: \"5147618e-26e5-44b5-a963-be10c5d6513b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ssqcs"
Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.403393 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e774cc7-59dc-4b84-b0b7-4d2171a154c1-config\") pod \"dnsmasq-dns-78dd6ddcc-l7tx6\" (UID: \"6e774cc7-59dc-4b84-b0b7-4d2171a154c1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-l7tx6"
Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.403441 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e774cc7-59dc-4b84-b0b7-4d2171a154c1-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-l7tx6\" (UID: \"6e774cc7-59dc-4b84-b0b7-4d2171a154c1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-l7tx6"
Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.403488 4869 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"kube-api-access-x9pwv\" (UniqueName: \"kubernetes.io/projected/6e774cc7-59dc-4b84-b0b7-4d2171a154c1-kube-api-access-x9pwv\") pod \"dnsmasq-dns-78dd6ddcc-l7tx6\" (UID: \"6e774cc7-59dc-4b84-b0b7-4d2171a154c1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-l7tx6" Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.404299 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e774cc7-59dc-4b84-b0b7-4d2171a154c1-config\") pod \"dnsmasq-dns-78dd6ddcc-l7tx6\" (UID: \"6e774cc7-59dc-4b84-b0b7-4d2171a154c1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-l7tx6" Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.404501 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e774cc7-59dc-4b84-b0b7-4d2171a154c1-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-l7tx6\" (UID: \"6e774cc7-59dc-4b84-b0b7-4d2171a154c1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-l7tx6" Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.419365 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9pwv\" (UniqueName: \"kubernetes.io/projected/6e774cc7-59dc-4b84-b0b7-4d2171a154c1-kube-api-access-x9pwv\") pod \"dnsmasq-dns-78dd6ddcc-l7tx6\" (UID: \"6e774cc7-59dc-4b84-b0b7-4d2171a154c1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-l7tx6" Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.492952 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ssqcs" Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.576663 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-l7tx6" Feb 18 06:03:49 crc kubenswrapper[4869]: I0218 06:03:49.733716 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ssqcs"] Feb 18 06:03:50 crc kubenswrapper[4869]: I0218 06:03:50.023158 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-l7tx6"] Feb 18 06:03:50 crc kubenswrapper[4869]: W0218 06:03:50.027768 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e774cc7_59dc_4b84_b0b7_4d2171a154c1.slice/crio-15912416a158ac601772a24caefe70661b6dbcb6490548439db776e072bf7a6c WatchSource:0}: Error finding container 15912416a158ac601772a24caefe70661b6dbcb6490548439db776e072bf7a6c: Status 404 returned error can't find the container with id 15912416a158ac601772a24caefe70661b6dbcb6490548439db776e072bf7a6c Feb 18 06:03:50 crc kubenswrapper[4869]: I0218 06:03:50.213733 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-l7tx6" event={"ID":"6e774cc7-59dc-4b84-b0b7-4d2171a154c1","Type":"ContainerStarted","Data":"15912416a158ac601772a24caefe70661b6dbcb6490548439db776e072bf7a6c"} Feb 18 06:03:50 crc kubenswrapper[4869]: I0218 06:03:50.214564 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-ssqcs" event={"ID":"5147618e-26e5-44b5-a963-be10c5d6513b","Type":"ContainerStarted","Data":"0a8cd41d55277bba49fd85b9c66d083897e57fbb1d8cbff3a34df2dd6d9c009f"} Feb 18 06:03:51 crc kubenswrapper[4869]: I0218 06:03:51.825304 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ssqcs"] Feb 18 06:03:51 crc kubenswrapper[4869]: I0218 06:03:51.854953 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dgrsz"] Feb 18 06:03:51 crc kubenswrapper[4869]: I0218 06:03:51.865285 4869 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-dgrsz" Feb 18 06:03:51 crc kubenswrapper[4869]: I0218 06:03:51.874946 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dgrsz"] Feb 18 06:03:51 crc kubenswrapper[4869]: I0218 06:03:51.940678 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5935f943-3934-43be-a2c9-6b12d9cb8188-dns-svc\") pod \"dnsmasq-dns-666b6646f7-dgrsz\" (UID: \"5935f943-3934-43be-a2c9-6b12d9cb8188\") " pod="openstack/dnsmasq-dns-666b6646f7-dgrsz" Feb 18 06:03:51 crc kubenswrapper[4869]: I0218 06:03:51.940760 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt9lb\" (UniqueName: \"kubernetes.io/projected/5935f943-3934-43be-a2c9-6b12d9cb8188-kube-api-access-bt9lb\") pod \"dnsmasq-dns-666b6646f7-dgrsz\" (UID: \"5935f943-3934-43be-a2c9-6b12d9cb8188\") " pod="openstack/dnsmasq-dns-666b6646f7-dgrsz" Feb 18 06:03:51 crc kubenswrapper[4869]: I0218 06:03:51.940790 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5935f943-3934-43be-a2c9-6b12d9cb8188-config\") pod \"dnsmasq-dns-666b6646f7-dgrsz\" (UID: \"5935f943-3934-43be-a2c9-6b12d9cb8188\") " pod="openstack/dnsmasq-dns-666b6646f7-dgrsz" Feb 18 06:03:52 crc kubenswrapper[4869]: I0218 06:03:52.042728 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt9lb\" (UniqueName: \"kubernetes.io/projected/5935f943-3934-43be-a2c9-6b12d9cb8188-kube-api-access-bt9lb\") pod \"dnsmasq-dns-666b6646f7-dgrsz\" (UID: \"5935f943-3934-43be-a2c9-6b12d9cb8188\") " pod="openstack/dnsmasq-dns-666b6646f7-dgrsz" Feb 18 06:03:52 crc kubenswrapper[4869]: I0218 06:03:52.042797 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/5935f943-3934-43be-a2c9-6b12d9cb8188-config\") pod \"dnsmasq-dns-666b6646f7-dgrsz\" (UID: \"5935f943-3934-43be-a2c9-6b12d9cb8188\") " pod="openstack/dnsmasq-dns-666b6646f7-dgrsz" Feb 18 06:03:52 crc kubenswrapper[4869]: I0218 06:03:52.042889 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5935f943-3934-43be-a2c9-6b12d9cb8188-dns-svc\") pod \"dnsmasq-dns-666b6646f7-dgrsz\" (UID: \"5935f943-3934-43be-a2c9-6b12d9cb8188\") " pod="openstack/dnsmasq-dns-666b6646f7-dgrsz" Feb 18 06:03:52 crc kubenswrapper[4869]: I0218 06:03:52.043808 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5935f943-3934-43be-a2c9-6b12d9cb8188-dns-svc\") pod \"dnsmasq-dns-666b6646f7-dgrsz\" (UID: \"5935f943-3934-43be-a2c9-6b12d9cb8188\") " pod="openstack/dnsmasq-dns-666b6646f7-dgrsz" Feb 18 06:03:52 crc kubenswrapper[4869]: I0218 06:03:52.044656 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5935f943-3934-43be-a2c9-6b12d9cb8188-config\") pod \"dnsmasq-dns-666b6646f7-dgrsz\" (UID: \"5935f943-3934-43be-a2c9-6b12d9cb8188\") " pod="openstack/dnsmasq-dns-666b6646f7-dgrsz" Feb 18 06:03:52 crc kubenswrapper[4869]: I0218 06:03:52.071638 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt9lb\" (UniqueName: \"kubernetes.io/projected/5935f943-3934-43be-a2c9-6b12d9cb8188-kube-api-access-bt9lb\") pod \"dnsmasq-dns-666b6646f7-dgrsz\" (UID: \"5935f943-3934-43be-a2c9-6b12d9cb8188\") " pod="openstack/dnsmasq-dns-666b6646f7-dgrsz" Feb 18 06:03:52 crc kubenswrapper[4869]: I0218 06:03:52.130199 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-l7tx6"] Feb 18 06:03:52 crc kubenswrapper[4869]: I0218 06:03:52.157771 4869 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-tlzx5"] Feb 18 06:03:52 crc kubenswrapper[4869]: I0218 06:03:52.158960 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-tlzx5" Feb 18 06:03:52 crc kubenswrapper[4869]: I0218 06:03:52.189196 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-tlzx5"] Feb 18 06:03:52 crc kubenswrapper[4869]: I0218 06:03:52.209777 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-dgrsz" Feb 18 06:03:52 crc kubenswrapper[4869]: I0218 06:03:52.247067 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb0523a5-e2bc-45f8-aff2-de770dcf88b2-config\") pod \"dnsmasq-dns-57d769cc4f-tlzx5\" (UID: \"bb0523a5-e2bc-45f8-aff2-de770dcf88b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-tlzx5" Feb 18 06:03:52 crc kubenswrapper[4869]: I0218 06:03:52.247165 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb0523a5-e2bc-45f8-aff2-de770dcf88b2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-tlzx5\" (UID: \"bb0523a5-e2bc-45f8-aff2-de770dcf88b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-tlzx5" Feb 18 06:03:52 crc kubenswrapper[4869]: I0218 06:03:52.247202 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn4rh\" (UniqueName: \"kubernetes.io/projected/bb0523a5-e2bc-45f8-aff2-de770dcf88b2-kube-api-access-hn4rh\") pod \"dnsmasq-dns-57d769cc4f-tlzx5\" (UID: \"bb0523a5-e2bc-45f8-aff2-de770dcf88b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-tlzx5" Feb 18 06:03:52 crc kubenswrapper[4869]: I0218 06:03:52.350192 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/bb0523a5-e2bc-45f8-aff2-de770dcf88b2-config\") pod \"dnsmasq-dns-57d769cc4f-tlzx5\" (UID: \"bb0523a5-e2bc-45f8-aff2-de770dcf88b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-tlzx5" Feb 18 06:03:52 crc kubenswrapper[4869]: I0218 06:03:52.350353 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb0523a5-e2bc-45f8-aff2-de770dcf88b2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-tlzx5\" (UID: \"bb0523a5-e2bc-45f8-aff2-de770dcf88b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-tlzx5" Feb 18 06:03:52 crc kubenswrapper[4869]: I0218 06:03:52.350392 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn4rh\" (UniqueName: \"kubernetes.io/projected/bb0523a5-e2bc-45f8-aff2-de770dcf88b2-kube-api-access-hn4rh\") pod \"dnsmasq-dns-57d769cc4f-tlzx5\" (UID: \"bb0523a5-e2bc-45f8-aff2-de770dcf88b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-tlzx5" Feb 18 06:03:52 crc kubenswrapper[4869]: I0218 06:03:52.352021 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb0523a5-e2bc-45f8-aff2-de770dcf88b2-config\") pod \"dnsmasq-dns-57d769cc4f-tlzx5\" (UID: \"bb0523a5-e2bc-45f8-aff2-de770dcf88b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-tlzx5" Feb 18 06:03:52 crc kubenswrapper[4869]: I0218 06:03:52.353493 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb0523a5-e2bc-45f8-aff2-de770dcf88b2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-tlzx5\" (UID: \"bb0523a5-e2bc-45f8-aff2-de770dcf88b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-tlzx5" Feb 18 06:03:52 crc kubenswrapper[4869]: I0218 06:03:52.375117 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn4rh\" (UniqueName: \"kubernetes.io/projected/bb0523a5-e2bc-45f8-aff2-de770dcf88b2-kube-api-access-hn4rh\") pod 
\"dnsmasq-dns-57d769cc4f-tlzx5\" (UID: \"bb0523a5-e2bc-45f8-aff2-de770dcf88b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-tlzx5" Feb 18 06:03:52 crc kubenswrapper[4869]: I0218 06:03:52.482250 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-tlzx5" Feb 18 06:03:52 crc kubenswrapper[4869]: I0218 06:03:52.643539 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dgrsz"] Feb 18 06:03:52 crc kubenswrapper[4869]: I0218 06:03:52.999519 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-tlzx5"] Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.013966 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.015148 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.019592 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-dtzm6" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.019779 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.019953 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.020068 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.020295 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.020407 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 18 06:03:53 crc 
kubenswrapper[4869]: I0218 06:03:53.021117 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.028451 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.081621 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/15f90eb3-a8d8-489d-b8f6-41046e14e165-pod-info\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.081686 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/15f90eb3-a8d8-489d-b8f6-41046e14e165-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.081767 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.081788 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/15f90eb3-a8d8-489d-b8f6-41046e14e165-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.081823 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/15f90eb3-a8d8-489d-b8f6-41046e14e165-server-conf\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.081843 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/15f90eb3-a8d8-489d-b8f6-41046e14e165-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.081866 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/15f90eb3-a8d8-489d-b8f6-41046e14e165-config-data\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.081890 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/15f90eb3-a8d8-489d-b8f6-41046e14e165-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.081935 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jhxv\" (UniqueName: \"kubernetes.io/projected/15f90eb3-a8d8-489d-b8f6-41046e14e165-kube-api-access-9jhxv\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.082179 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/15f90eb3-a8d8-489d-b8f6-41046e14e165-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.082235 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/15f90eb3-a8d8-489d-b8f6-41046e14e165-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.184382 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/15f90eb3-a8d8-489d-b8f6-41046e14e165-server-conf\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.184531 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/15f90eb3-a8d8-489d-b8f6-41046e14e165-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.184553 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/15f90eb3-a8d8-489d-b8f6-41046e14e165-config-data\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.184573 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/15f90eb3-a8d8-489d-b8f6-41046e14e165-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " 
pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.184612 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jhxv\" (UniqueName: \"kubernetes.io/projected/15f90eb3-a8d8-489d-b8f6-41046e14e165-kube-api-access-9jhxv\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.184641 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/15f90eb3-a8d8-489d-b8f6-41046e14e165-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.184680 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/15f90eb3-a8d8-489d-b8f6-41046e14e165-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.184722 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/15f90eb3-a8d8-489d-b8f6-41046e14e165-pod-info\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.184777 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/15f90eb3-a8d8-489d-b8f6-41046e14e165-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.184803 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.184843 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/15f90eb3-a8d8-489d-b8f6-41046e14e165-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.186310 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/15f90eb3-a8d8-489d-b8f6-41046e14e165-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.186684 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/15f90eb3-a8d8-489d-b8f6-41046e14e165-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.186910 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/15f90eb3-a8d8-489d-b8f6-41046e14e165-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.186988 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: 
\"15f90eb3-a8d8-489d-b8f6-41046e14e165\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.188156 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/15f90eb3-a8d8-489d-b8f6-41046e14e165-server-conf\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.189265 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/15f90eb3-a8d8-489d-b8f6-41046e14e165-config-data\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.193709 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/15f90eb3-a8d8-489d-b8f6-41046e14e165-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.194933 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/15f90eb3-a8d8-489d-b8f6-41046e14e165-pod-info\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.202635 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/15f90eb3-a8d8-489d-b8f6-41046e14e165-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.202798 4869 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/15f90eb3-a8d8-489d-b8f6-41046e14e165-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.205733 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jhxv\" (UniqueName: \"kubernetes.io/projected/15f90eb3-a8d8-489d-b8f6-41046e14e165-kube-api-access-9jhxv\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.207993 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.242589 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-dgrsz" event={"ID":"5935f943-3934-43be-a2c9-6b12d9cb8188","Type":"ContainerStarted","Data":"0eee79ceaf576ec964448f0c09905a1f62771ee9b948bae313d0ab275db29304"} Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.287113 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.288772 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.294386 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-4bnpk" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.294425 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.294486 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.294491 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.294565 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.294664 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.294794 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.297121 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.360764 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.387325 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.387360 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.387388 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.387414 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.387431 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.387449 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.387477 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmxmj\" (UniqueName: \"kubernetes.io/projected/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-kube-api-access-qmxmj\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.387501 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.387517 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.387537 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.387555 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.491146 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.491211 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.491253 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.491283 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc 
kubenswrapper[4869]: I0218 06:03:53.491415 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.491438 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.491465 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.491486 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.491504 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.491522 4869 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.491562 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmxmj\" (UniqueName: \"kubernetes.io/projected/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-kube-api-access-qmxmj\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.492244 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.492907 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.493875 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.494244 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.494460 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.495400 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.495440 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.498351 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.501446 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.501935 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.509865 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmxmj\" (UniqueName: \"kubernetes.io/projected/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-kube-api-access-qmxmj\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.531736 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:53 crc kubenswrapper[4869]: I0218 06:03:53.647896 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.406350 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.408952 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.417244 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.417369 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.418209 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.418461 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-7nqg2" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.421634 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.425027 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.506228 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"32064888-24ad-482d-ba16-36bfb48b069e\") " pod="openstack/openstack-galera-0" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.506302 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/32064888-24ad-482d-ba16-36bfb48b069e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"32064888-24ad-482d-ba16-36bfb48b069e\") " pod="openstack/openstack-galera-0" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.506376 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/32064888-24ad-482d-ba16-36bfb48b069e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"32064888-24ad-482d-ba16-36bfb48b069e\") " pod="openstack/openstack-galera-0" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.506398 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/32064888-24ad-482d-ba16-36bfb48b069e-config-data-default\") pod \"openstack-galera-0\" (UID: \"32064888-24ad-482d-ba16-36bfb48b069e\") " pod="openstack/openstack-galera-0" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.506426 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32064888-24ad-482d-ba16-36bfb48b069e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"32064888-24ad-482d-ba16-36bfb48b069e\") " pod="openstack/openstack-galera-0" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.506448 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32064888-24ad-482d-ba16-36bfb48b069e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"32064888-24ad-482d-ba16-36bfb48b069e\") " pod="openstack/openstack-galera-0" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.506466 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/32064888-24ad-482d-ba16-36bfb48b069e-kolla-config\") pod \"openstack-galera-0\" (UID: \"32064888-24ad-482d-ba16-36bfb48b069e\") " pod="openstack/openstack-galera-0" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.506497 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs8sh\" (UniqueName: 
\"kubernetes.io/projected/32064888-24ad-482d-ba16-36bfb48b069e-kube-api-access-gs8sh\") pod \"openstack-galera-0\" (UID: \"32064888-24ad-482d-ba16-36bfb48b069e\") " pod="openstack/openstack-galera-0" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.607821 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"32064888-24ad-482d-ba16-36bfb48b069e\") " pod="openstack/openstack-galera-0" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.608137 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/32064888-24ad-482d-ba16-36bfb48b069e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"32064888-24ad-482d-ba16-36bfb48b069e\") " pod="openstack/openstack-galera-0" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.608203 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/32064888-24ad-482d-ba16-36bfb48b069e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"32064888-24ad-482d-ba16-36bfb48b069e\") " pod="openstack/openstack-galera-0" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.608236 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/32064888-24ad-482d-ba16-36bfb48b069e-config-data-default\") pod \"openstack-galera-0\" (UID: \"32064888-24ad-482d-ba16-36bfb48b069e\") " pod="openstack/openstack-galera-0" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.608263 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32064888-24ad-482d-ba16-36bfb48b069e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"32064888-24ad-482d-ba16-36bfb48b069e\") " pod="openstack/openstack-galera-0" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.608283 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32064888-24ad-482d-ba16-36bfb48b069e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"32064888-24ad-482d-ba16-36bfb48b069e\") " pod="openstack/openstack-galera-0" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.608298 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/32064888-24ad-482d-ba16-36bfb48b069e-kolla-config\") pod \"openstack-galera-0\" (UID: \"32064888-24ad-482d-ba16-36bfb48b069e\") " pod="openstack/openstack-galera-0" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.608321 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs8sh\" (UniqueName: \"kubernetes.io/projected/32064888-24ad-482d-ba16-36bfb48b069e-kube-api-access-gs8sh\") pod \"openstack-galera-0\" (UID: \"32064888-24ad-482d-ba16-36bfb48b069e\") " pod="openstack/openstack-galera-0" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.608072 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"32064888-24ad-482d-ba16-36bfb48b069e\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-galera-0" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.611154 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/32064888-24ad-482d-ba16-36bfb48b069e-config-data-default\") pod \"openstack-galera-0\" (UID: \"32064888-24ad-482d-ba16-36bfb48b069e\") " pod="openstack/openstack-galera-0" Feb 18 06:03:54 crc kubenswrapper[4869]: 
I0218 06:03:54.611577 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/32064888-24ad-482d-ba16-36bfb48b069e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"32064888-24ad-482d-ba16-36bfb48b069e\") " pod="openstack/openstack-galera-0" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.618051 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/32064888-24ad-482d-ba16-36bfb48b069e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"32064888-24ad-482d-ba16-36bfb48b069e\") " pod="openstack/openstack-galera-0" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.620934 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32064888-24ad-482d-ba16-36bfb48b069e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"32064888-24ad-482d-ba16-36bfb48b069e\") " pod="openstack/openstack-galera-0" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.622231 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32064888-24ad-482d-ba16-36bfb48b069e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"32064888-24ad-482d-ba16-36bfb48b069e\") " pod="openstack/openstack-galera-0" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.624298 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/32064888-24ad-482d-ba16-36bfb48b069e-kolla-config\") pod \"openstack-galera-0\" (UID: \"32064888-24ad-482d-ba16-36bfb48b069e\") " pod="openstack/openstack-galera-0" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.625476 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"32064888-24ad-482d-ba16-36bfb48b069e\") " pod="openstack/openstack-galera-0" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.627330 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs8sh\" (UniqueName: \"kubernetes.io/projected/32064888-24ad-482d-ba16-36bfb48b069e-kube-api-access-gs8sh\") pod \"openstack-galera-0\" (UID: \"32064888-24ad-482d-ba16-36bfb48b069e\") " pod="openstack/openstack-galera-0" Feb 18 06:03:54 crc kubenswrapper[4869]: I0218 06:03:54.779450 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.781922 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.783586 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.785968 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.786532 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.788968 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.792141 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-lktns" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.809502 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.827260 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/23dc38a3-9ce0-4f1f-9495-2dc65f2474e5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.827538 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/23dc38a3-9ce0-4f1f-9495-2dc65f2474e5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.827670 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/23dc38a3-9ce0-4f1f-9495-2dc65f2474e5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.827716 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5txn\" (UniqueName: \"kubernetes.io/projected/23dc38a3-9ce0-4f1f-9495-2dc65f2474e5-kube-api-access-r5txn\") pod \"openstack-cell1-galera-0\" (UID: \"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.827803 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23dc38a3-9ce0-4f1f-9495-2dc65f2474e5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:03:55 crc 
kubenswrapper[4869]: I0218 06:03:55.827851 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/23dc38a3-9ce0-4f1f-9495-2dc65f2474e5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.827999 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.828054 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23dc38a3-9ce0-4f1f-9495-2dc65f2474e5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.929206 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/23dc38a3-9ce0-4f1f-9495-2dc65f2474e5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.929317 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.929358 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23dc38a3-9ce0-4f1f-9495-2dc65f2474e5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.929392 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/23dc38a3-9ce0-4f1f-9495-2dc65f2474e5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.929449 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/23dc38a3-9ce0-4f1f-9495-2dc65f2474e5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.929506 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/23dc38a3-9ce0-4f1f-9495-2dc65f2474e5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.929539 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5txn\" (UniqueName: \"kubernetes.io/projected/23dc38a3-9ce0-4f1f-9495-2dc65f2474e5-kube-api-access-r5txn\") pod \"openstack-cell1-galera-0\" (UID: \"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.929571 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23dc38a3-9ce0-4f1f-9495-2dc65f2474e5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.931266 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/23dc38a3-9ce0-4f1f-9495-2dc65f2474e5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.931321 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23dc38a3-9ce0-4f1f-9495-2dc65f2474e5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.932118 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/23dc38a3-9ce0-4f1f-9495-2dc65f2474e5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.932377 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/23dc38a3-9ce0-4f1f-9495-2dc65f2474e5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.932983 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod 
\"openstack-cell1-galera-0\" (UID: \"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-cell1-galera-0" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.935511 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/23dc38a3-9ce0-4f1f-9495-2dc65f2474e5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.937237 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23dc38a3-9ce0-4f1f-9495-2dc65f2474e5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.950859 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5txn\" (UniqueName: \"kubernetes.io/projected/23dc38a3-9ce0-4f1f-9495-2dc65f2474e5-kube-api-access-r5txn\") pod \"openstack-cell1-galera-0\" (UID: \"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:03:55 crc kubenswrapper[4869]: I0218 06:03:55.954392 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:03:56 crc kubenswrapper[4869]: I0218 06:03:56.105042 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 18 06:03:56 crc kubenswrapper[4869]: I0218 06:03:56.106141 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 18 06:03:56 crc kubenswrapper[4869]: I0218 06:03:56.108360 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-cp69s" Feb 18 06:03:56 crc kubenswrapper[4869]: I0218 06:03:56.108448 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 18 06:03:56 crc kubenswrapper[4869]: I0218 06:03:56.108361 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 18 06:03:56 crc kubenswrapper[4869]: I0218 06:03:56.112122 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 18 06:03:56 crc kubenswrapper[4869]: I0218 06:03:56.118160 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 18 06:03:56 crc kubenswrapper[4869]: I0218 06:03:56.131415 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6c89f7e-0259-4c42-9e24-cd8391cda1a3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a6c89f7e-0259-4c42-9e24-cd8391cda1a3\") " pod="openstack/memcached-0" Feb 18 06:03:56 crc kubenswrapper[4869]: I0218 06:03:56.131791 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4stbj\" (UniqueName: \"kubernetes.io/projected/a6c89f7e-0259-4c42-9e24-cd8391cda1a3-kube-api-access-4stbj\") pod \"memcached-0\" (UID: \"a6c89f7e-0259-4c42-9e24-cd8391cda1a3\") " pod="openstack/memcached-0" Feb 18 06:03:56 crc kubenswrapper[4869]: I0218 06:03:56.131850 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a6c89f7e-0259-4c42-9e24-cd8391cda1a3-kolla-config\") pod \"memcached-0\" (UID: 
\"a6c89f7e-0259-4c42-9e24-cd8391cda1a3\") " pod="openstack/memcached-0" Feb 18 06:03:56 crc kubenswrapper[4869]: I0218 06:03:56.131906 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c89f7e-0259-4c42-9e24-cd8391cda1a3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a6c89f7e-0259-4c42-9e24-cd8391cda1a3\") " pod="openstack/memcached-0" Feb 18 06:03:56 crc kubenswrapper[4869]: I0218 06:03:56.131959 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a6c89f7e-0259-4c42-9e24-cd8391cda1a3-config-data\") pod \"memcached-0\" (UID: \"a6c89f7e-0259-4c42-9e24-cd8391cda1a3\") " pod="openstack/memcached-0" Feb 18 06:03:56 crc kubenswrapper[4869]: I0218 06:03:56.233269 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a6c89f7e-0259-4c42-9e24-cd8391cda1a3-config-data\") pod \"memcached-0\" (UID: \"a6c89f7e-0259-4c42-9e24-cd8391cda1a3\") " pod="openstack/memcached-0" Feb 18 06:03:56 crc kubenswrapper[4869]: I0218 06:03:56.233322 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6c89f7e-0259-4c42-9e24-cd8391cda1a3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a6c89f7e-0259-4c42-9e24-cd8391cda1a3\") " pod="openstack/memcached-0" Feb 18 06:03:56 crc kubenswrapper[4869]: I0218 06:03:56.233352 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4stbj\" (UniqueName: \"kubernetes.io/projected/a6c89f7e-0259-4c42-9e24-cd8391cda1a3-kube-api-access-4stbj\") pod \"memcached-0\" (UID: \"a6c89f7e-0259-4c42-9e24-cd8391cda1a3\") " pod="openstack/memcached-0" Feb 18 06:03:56 crc kubenswrapper[4869]: I0218 06:03:56.233391 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a6c89f7e-0259-4c42-9e24-cd8391cda1a3-kolla-config\") pod \"memcached-0\" (UID: \"a6c89f7e-0259-4c42-9e24-cd8391cda1a3\") " pod="openstack/memcached-0" Feb 18 06:03:56 crc kubenswrapper[4869]: I0218 06:03:56.233466 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c89f7e-0259-4c42-9e24-cd8391cda1a3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a6c89f7e-0259-4c42-9e24-cd8391cda1a3\") " pod="openstack/memcached-0" Feb 18 06:03:56 crc kubenswrapper[4869]: I0218 06:03:56.234358 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a6c89f7e-0259-4c42-9e24-cd8391cda1a3-kolla-config\") pod \"memcached-0\" (UID: \"a6c89f7e-0259-4c42-9e24-cd8391cda1a3\") " pod="openstack/memcached-0" Feb 18 06:03:56 crc kubenswrapper[4869]: I0218 06:03:56.234490 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a6c89f7e-0259-4c42-9e24-cd8391cda1a3-config-data\") pod \"memcached-0\" (UID: \"a6c89f7e-0259-4c42-9e24-cd8391cda1a3\") " pod="openstack/memcached-0" Feb 18 06:03:56 crc kubenswrapper[4869]: I0218 06:03:56.239407 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6c89f7e-0259-4c42-9e24-cd8391cda1a3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a6c89f7e-0259-4c42-9e24-cd8391cda1a3\") " pod="openstack/memcached-0" Feb 18 06:03:56 crc kubenswrapper[4869]: I0218 06:03:56.251525 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c89f7e-0259-4c42-9e24-cd8391cda1a3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a6c89f7e-0259-4c42-9e24-cd8391cda1a3\") " 
pod="openstack/memcached-0" Feb 18 06:03:56 crc kubenswrapper[4869]: I0218 06:03:56.253060 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4stbj\" (UniqueName: \"kubernetes.io/projected/a6c89f7e-0259-4c42-9e24-cd8391cda1a3-kube-api-access-4stbj\") pod \"memcached-0\" (UID: \"a6c89f7e-0259-4c42-9e24-cd8391cda1a3\") " pod="openstack/memcached-0" Feb 18 06:03:56 crc kubenswrapper[4869]: I0218 06:03:56.424152 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 18 06:03:57 crc kubenswrapper[4869]: I0218 06:03:57.304707 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-tlzx5" event={"ID":"bb0523a5-e2bc-45f8-aff2-de770dcf88b2","Type":"ContainerStarted","Data":"e8c539975cfd907eef7330a42d2e72e3e1c40379c945001303cefd2c7c1ed78f"} Feb 18 06:03:58 crc kubenswrapper[4869]: I0218 06:03:58.175164 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 18 06:03:58 crc kubenswrapper[4869]: I0218 06:03:58.176037 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 18 06:03:58 crc kubenswrapper[4869]: I0218 06:03:58.179551 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-49cvh" Feb 18 06:03:58 crc kubenswrapper[4869]: I0218 06:03:58.190124 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 18 06:03:58 crc kubenswrapper[4869]: I0218 06:03:58.281657 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mshl\" (UniqueName: \"kubernetes.io/projected/128630be-af69-4db6-bad0-59f17dc9dec0-kube-api-access-9mshl\") pod \"kube-state-metrics-0\" (UID: \"128630be-af69-4db6-bad0-59f17dc9dec0\") " pod="openstack/kube-state-metrics-0" Feb 18 06:03:58 crc kubenswrapper[4869]: I0218 06:03:58.382980 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mshl\" (UniqueName: \"kubernetes.io/projected/128630be-af69-4db6-bad0-59f17dc9dec0-kube-api-access-9mshl\") pod \"kube-state-metrics-0\" (UID: \"128630be-af69-4db6-bad0-59f17dc9dec0\") " pod="openstack/kube-state-metrics-0" Feb 18 06:03:58 crc kubenswrapper[4869]: I0218 06:03:58.413260 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mshl\" (UniqueName: \"kubernetes.io/projected/128630be-af69-4db6-bad0-59f17dc9dec0-kube-api-access-9mshl\") pod \"kube-state-metrics-0\" (UID: \"128630be-af69-4db6-bad0-59f17dc9dec0\") " pod="openstack/kube-state-metrics-0" Feb 18 06:03:58 crc kubenswrapper[4869]: I0218 06:03:58.493762 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.300864 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6zzxt"] Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.303615 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6zzxt" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.306851 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.306956 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.309819 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-rfc22" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.313774 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-5czp2"] Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.321735 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6zzxt"] Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.321847 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-5czp2" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.327664 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpbjt\" (UniqueName: \"kubernetes.io/projected/0b85434e-56f8-4cab-91a5-8cf0ea0356fc-kube-api-access-lpbjt\") pod \"ovn-controller-6zzxt\" (UID: \"0b85434e-56f8-4cab-91a5-8cf0ea0356fc\") " pod="openstack/ovn-controller-6zzxt" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.327708 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0b85434e-56f8-4cab-91a5-8cf0ea0356fc-var-run-ovn\") pod \"ovn-controller-6zzxt\" (UID: \"0b85434e-56f8-4cab-91a5-8cf0ea0356fc\") " pod="openstack/ovn-controller-6zzxt" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.327734 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b85434e-56f8-4cab-91a5-8cf0ea0356fc-ovn-controller-tls-certs\") pod \"ovn-controller-6zzxt\" (UID: \"0b85434e-56f8-4cab-91a5-8cf0ea0356fc\") " pod="openstack/ovn-controller-6zzxt" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.327800 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b85434e-56f8-4cab-91a5-8cf0ea0356fc-scripts\") pod \"ovn-controller-6zzxt\" (UID: \"0b85434e-56f8-4cab-91a5-8cf0ea0356fc\") " pod="openstack/ovn-controller-6zzxt" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.327829 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0b85434e-56f8-4cab-91a5-8cf0ea0356fc-var-run\") pod \"ovn-controller-6zzxt\" (UID: \"0b85434e-56f8-4cab-91a5-8cf0ea0356fc\") " 
pod="openstack/ovn-controller-6zzxt" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.327851 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b85434e-56f8-4cab-91a5-8cf0ea0356fc-combined-ca-bundle\") pod \"ovn-controller-6zzxt\" (UID: \"0b85434e-56f8-4cab-91a5-8cf0ea0356fc\") " pod="openstack/ovn-controller-6zzxt" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.327905 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0b85434e-56f8-4cab-91a5-8cf0ea0356fc-var-log-ovn\") pod \"ovn-controller-6zzxt\" (UID: \"0b85434e-56f8-4cab-91a5-8cf0ea0356fc\") " pod="openstack/ovn-controller-6zzxt" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.353258 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-5czp2"] Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.428676 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e4e9056-1f05-4fc5-b1a1-e578abbc24c6-var-log\") pod \"ovn-controller-ovs-5czp2\" (UID: \"8e4e9056-1f05-4fc5-b1a1-e578abbc24c6\") " pod="openstack/ovn-controller-ovs-5czp2" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.428754 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0b85434e-56f8-4cab-91a5-8cf0ea0356fc-var-log-ovn\") pod \"ovn-controller-6zzxt\" (UID: \"0b85434e-56f8-4cab-91a5-8cf0ea0356fc\") " pod="openstack/ovn-controller-6zzxt" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.428803 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz6nd\" (UniqueName: 
\"kubernetes.io/projected/8e4e9056-1f05-4fc5-b1a1-e578abbc24c6-kube-api-access-hz6nd\") pod \"ovn-controller-ovs-5czp2\" (UID: \"8e4e9056-1f05-4fc5-b1a1-e578abbc24c6\") " pod="openstack/ovn-controller-ovs-5czp2" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.428832 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpbjt\" (UniqueName: \"kubernetes.io/projected/0b85434e-56f8-4cab-91a5-8cf0ea0356fc-kube-api-access-lpbjt\") pod \"ovn-controller-6zzxt\" (UID: \"0b85434e-56f8-4cab-91a5-8cf0ea0356fc\") " pod="openstack/ovn-controller-6zzxt" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.428857 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0b85434e-56f8-4cab-91a5-8cf0ea0356fc-var-run-ovn\") pod \"ovn-controller-6zzxt\" (UID: \"0b85434e-56f8-4cab-91a5-8cf0ea0356fc\") " pod="openstack/ovn-controller-6zzxt" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.428881 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e4e9056-1f05-4fc5-b1a1-e578abbc24c6-scripts\") pod \"ovn-controller-ovs-5czp2\" (UID: \"8e4e9056-1f05-4fc5-b1a1-e578abbc24c6\") " pod="openstack/ovn-controller-ovs-5czp2" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.428903 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b85434e-56f8-4cab-91a5-8cf0ea0356fc-ovn-controller-tls-certs\") pod \"ovn-controller-6zzxt\" (UID: \"0b85434e-56f8-4cab-91a5-8cf0ea0356fc\") " pod="openstack/ovn-controller-6zzxt" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.428954 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/8e4e9056-1f05-4fc5-b1a1-e578abbc24c6-var-run\") pod \"ovn-controller-ovs-5czp2\" (UID: \"8e4e9056-1f05-4fc5-b1a1-e578abbc24c6\") " pod="openstack/ovn-controller-ovs-5czp2" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.428988 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b85434e-56f8-4cab-91a5-8cf0ea0356fc-scripts\") pod \"ovn-controller-6zzxt\" (UID: \"0b85434e-56f8-4cab-91a5-8cf0ea0356fc\") " pod="openstack/ovn-controller-6zzxt" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.429005 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8e4e9056-1f05-4fc5-b1a1-e578abbc24c6-var-lib\") pod \"ovn-controller-ovs-5czp2\" (UID: \"8e4e9056-1f05-4fc5-b1a1-e578abbc24c6\") " pod="openstack/ovn-controller-ovs-5czp2" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.429032 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0b85434e-56f8-4cab-91a5-8cf0ea0356fc-var-run\") pod \"ovn-controller-6zzxt\" (UID: \"0b85434e-56f8-4cab-91a5-8cf0ea0356fc\") " pod="openstack/ovn-controller-6zzxt" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.429049 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8e4e9056-1f05-4fc5-b1a1-e578abbc24c6-etc-ovs\") pod \"ovn-controller-ovs-5czp2\" (UID: \"8e4e9056-1f05-4fc5-b1a1-e578abbc24c6\") " pod="openstack/ovn-controller-ovs-5czp2" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.429065 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b85434e-56f8-4cab-91a5-8cf0ea0356fc-combined-ca-bundle\") pod \"ovn-controller-6zzxt\" (UID: 
\"0b85434e-56f8-4cab-91a5-8cf0ea0356fc\") " pod="openstack/ovn-controller-6zzxt" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.429929 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0b85434e-56f8-4cab-91a5-8cf0ea0356fc-var-log-ovn\") pod \"ovn-controller-6zzxt\" (UID: \"0b85434e-56f8-4cab-91a5-8cf0ea0356fc\") " pod="openstack/ovn-controller-6zzxt" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.429949 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0b85434e-56f8-4cab-91a5-8cf0ea0356fc-var-run-ovn\") pod \"ovn-controller-6zzxt\" (UID: \"0b85434e-56f8-4cab-91a5-8cf0ea0356fc\") " pod="openstack/ovn-controller-6zzxt" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.431788 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b85434e-56f8-4cab-91a5-8cf0ea0356fc-scripts\") pod \"ovn-controller-6zzxt\" (UID: \"0b85434e-56f8-4cab-91a5-8cf0ea0356fc\") " pod="openstack/ovn-controller-6zzxt" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.431896 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0b85434e-56f8-4cab-91a5-8cf0ea0356fc-var-run\") pod \"ovn-controller-6zzxt\" (UID: \"0b85434e-56f8-4cab-91a5-8cf0ea0356fc\") " pod="openstack/ovn-controller-6zzxt" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.435286 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b85434e-56f8-4cab-91a5-8cf0ea0356fc-ovn-controller-tls-certs\") pod \"ovn-controller-6zzxt\" (UID: \"0b85434e-56f8-4cab-91a5-8cf0ea0356fc\") " pod="openstack/ovn-controller-6zzxt" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.435365 4869 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b85434e-56f8-4cab-91a5-8cf0ea0356fc-combined-ca-bundle\") pod \"ovn-controller-6zzxt\" (UID: \"0b85434e-56f8-4cab-91a5-8cf0ea0356fc\") " pod="openstack/ovn-controller-6zzxt" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.452145 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpbjt\" (UniqueName: \"kubernetes.io/projected/0b85434e-56f8-4cab-91a5-8cf0ea0356fc-kube-api-access-lpbjt\") pod \"ovn-controller-6zzxt\" (UID: \"0b85434e-56f8-4cab-91a5-8cf0ea0356fc\") " pod="openstack/ovn-controller-6zzxt" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.530975 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8e4e9056-1f05-4fc5-b1a1-e578abbc24c6-var-lib\") pod \"ovn-controller-ovs-5czp2\" (UID: \"8e4e9056-1f05-4fc5-b1a1-e578abbc24c6\") " pod="openstack/ovn-controller-ovs-5czp2" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.531040 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8e4e9056-1f05-4fc5-b1a1-e578abbc24c6-etc-ovs\") pod \"ovn-controller-ovs-5czp2\" (UID: \"8e4e9056-1f05-4fc5-b1a1-e578abbc24c6\") " pod="openstack/ovn-controller-ovs-5czp2" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.531088 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e4e9056-1f05-4fc5-b1a1-e578abbc24c6-var-log\") pod \"ovn-controller-ovs-5czp2\" (UID: \"8e4e9056-1f05-4fc5-b1a1-e578abbc24c6\") " pod="openstack/ovn-controller-ovs-5czp2" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.531139 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz6nd\" (UniqueName: 
\"kubernetes.io/projected/8e4e9056-1f05-4fc5-b1a1-e578abbc24c6-kube-api-access-hz6nd\") pod \"ovn-controller-ovs-5czp2\" (UID: \"8e4e9056-1f05-4fc5-b1a1-e578abbc24c6\") " pod="openstack/ovn-controller-ovs-5czp2" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.531167 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e4e9056-1f05-4fc5-b1a1-e578abbc24c6-scripts\") pod \"ovn-controller-ovs-5czp2\" (UID: \"8e4e9056-1f05-4fc5-b1a1-e578abbc24c6\") " pod="openstack/ovn-controller-ovs-5czp2" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.531232 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8e4e9056-1f05-4fc5-b1a1-e578abbc24c6-var-run\") pod \"ovn-controller-ovs-5czp2\" (UID: \"8e4e9056-1f05-4fc5-b1a1-e578abbc24c6\") " pod="openstack/ovn-controller-ovs-5czp2" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.531353 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8e4e9056-1f05-4fc5-b1a1-e578abbc24c6-var-run\") pod \"ovn-controller-ovs-5czp2\" (UID: \"8e4e9056-1f05-4fc5-b1a1-e578abbc24c6\") " pod="openstack/ovn-controller-ovs-5czp2" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.531410 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e4e9056-1f05-4fc5-b1a1-e578abbc24c6-var-log\") pod \"ovn-controller-ovs-5czp2\" (UID: \"8e4e9056-1f05-4fc5-b1a1-e578abbc24c6\") " pod="openstack/ovn-controller-ovs-5czp2" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.531600 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8e4e9056-1f05-4fc5-b1a1-e578abbc24c6-etc-ovs\") pod \"ovn-controller-ovs-5czp2\" (UID: \"8e4e9056-1f05-4fc5-b1a1-e578abbc24c6\") " 
pod="openstack/ovn-controller-ovs-5czp2" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.531686 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8e4e9056-1f05-4fc5-b1a1-e578abbc24c6-var-lib\") pod \"ovn-controller-ovs-5czp2\" (UID: \"8e4e9056-1f05-4fc5-b1a1-e578abbc24c6\") " pod="openstack/ovn-controller-ovs-5czp2" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.533823 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e4e9056-1f05-4fc5-b1a1-e578abbc24c6-scripts\") pod \"ovn-controller-ovs-5czp2\" (UID: \"8e4e9056-1f05-4fc5-b1a1-e578abbc24c6\") " pod="openstack/ovn-controller-ovs-5czp2" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.546785 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz6nd\" (UniqueName: \"kubernetes.io/projected/8e4e9056-1f05-4fc5-b1a1-e578abbc24c6-kube-api-access-hz6nd\") pod \"ovn-controller-ovs-5czp2\" (UID: \"8e4e9056-1f05-4fc5-b1a1-e578abbc24c6\") " pod="openstack/ovn-controller-ovs-5czp2" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.587421 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.588635 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.590805 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.591171 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.593485 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.593669 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-jpzwz" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.593818 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.609142 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.629280 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6zzxt" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.659356 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-5czp2" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.734646 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/25b0cd0b-8d96-4067-a1da-171e5f0b9545-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"25b0cd0b-8d96-4067-a1da-171e5f0b9545\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.735074 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h79lc\" (UniqueName: \"kubernetes.io/projected/25b0cd0b-8d96-4067-a1da-171e5f0b9545-kube-api-access-h79lc\") pod \"ovsdbserver-nb-0\" (UID: \"25b0cd0b-8d96-4067-a1da-171e5f0b9545\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.735122 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"25b0cd0b-8d96-4067-a1da-171e5f0b9545\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.735142 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/25b0cd0b-8d96-4067-a1da-171e5f0b9545-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"25b0cd0b-8d96-4067-a1da-171e5f0b9545\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.735158 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25b0cd0b-8d96-4067-a1da-171e5f0b9545-config\") pod \"ovsdbserver-nb-0\" (UID: \"25b0cd0b-8d96-4067-a1da-171e5f0b9545\") " pod="openstack/ovsdbserver-nb-0" Feb 18 
06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.735234 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b0cd0b-8d96-4067-a1da-171e5f0b9545-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"25b0cd0b-8d96-4067-a1da-171e5f0b9545\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.735295 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/25b0cd0b-8d96-4067-a1da-171e5f0b9545-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"25b0cd0b-8d96-4067-a1da-171e5f0b9545\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.735312 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/25b0cd0b-8d96-4067-a1da-171e5f0b9545-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"25b0cd0b-8d96-4067-a1da-171e5f0b9545\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.837143 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/25b0cd0b-8d96-4067-a1da-171e5f0b9545-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"25b0cd0b-8d96-4067-a1da-171e5f0b9545\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.837193 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/25b0cd0b-8d96-4067-a1da-171e5f0b9545-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"25b0cd0b-8d96-4067-a1da-171e5f0b9545\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.837237 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/25b0cd0b-8d96-4067-a1da-171e5f0b9545-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"25b0cd0b-8d96-4067-a1da-171e5f0b9545\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.837259 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h79lc\" (UniqueName: \"kubernetes.io/projected/25b0cd0b-8d96-4067-a1da-171e5f0b9545-kube-api-access-h79lc\") pod \"ovsdbserver-nb-0\" (UID: \"25b0cd0b-8d96-4067-a1da-171e5f0b9545\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.837295 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"25b0cd0b-8d96-4067-a1da-171e5f0b9545\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.837312 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/25b0cd0b-8d96-4067-a1da-171e5f0b9545-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"25b0cd0b-8d96-4067-a1da-171e5f0b9545\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.837327 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25b0cd0b-8d96-4067-a1da-171e5f0b9545-config\") pod \"ovsdbserver-nb-0\" (UID: \"25b0cd0b-8d96-4067-a1da-171e5f0b9545\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.837397 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b0cd0b-8d96-4067-a1da-171e5f0b9545-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: 
\"25b0cd0b-8d96-4067-a1da-171e5f0b9545\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.840015 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/25b0cd0b-8d96-4067-a1da-171e5f0b9545-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"25b0cd0b-8d96-4067-a1da-171e5f0b9545\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.840268 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/25b0cd0b-8d96-4067-a1da-171e5f0b9545-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"25b0cd0b-8d96-4067-a1da-171e5f0b9545\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.841255 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"25b0cd0b-8d96-4067-a1da-171e5f0b9545\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.842390 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25b0cd0b-8d96-4067-a1da-171e5f0b9545-config\") pod \"ovsdbserver-nb-0\" (UID: \"25b0cd0b-8d96-4067-a1da-171e5f0b9545\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.848651 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b0cd0b-8d96-4067-a1da-171e5f0b9545-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"25b0cd0b-8d96-4067-a1da-171e5f0b9545\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.849528 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/25b0cd0b-8d96-4067-a1da-171e5f0b9545-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"25b0cd0b-8d96-4067-a1da-171e5f0b9545\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.853232 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/25b0cd0b-8d96-4067-a1da-171e5f0b9545-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"25b0cd0b-8d96-4067-a1da-171e5f0b9545\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.859385 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h79lc\" (UniqueName: \"kubernetes.io/projected/25b0cd0b-8d96-4067-a1da-171e5f0b9545-kube-api-access-h79lc\") pod \"ovsdbserver-nb-0\" (UID: \"25b0cd0b-8d96-4067-a1da-171e5f0b9545\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.868455 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"25b0cd0b-8d96-4067-a1da-171e5f0b9545\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:01 crc kubenswrapper[4869]: I0218 06:04:01.916003 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: E0218 06:04:05.074734 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 18 06:04:05 crc kubenswrapper[4869]: E0218 06:04:05.075037 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nqz7f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFi
lesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-ssqcs_openstack(5147618e-26e5-44b5-a963-be10c5d6513b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:04:05 crc kubenswrapper[4869]: E0218 06:04:05.076727 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-ssqcs" podUID="5147618e-26e5-44b5-a963-be10c5d6513b" Feb 18 06:04:05 crc kubenswrapper[4869]: E0218 06:04:05.090161 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 18 06:04:05 crc kubenswrapper[4869]: E0218 06:04:05.090293 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x9pwv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-l7tx6_openstack(6e774cc7-59dc-4b84-b0b7-4d2171a154c1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:04:05 crc kubenswrapper[4869]: E0218 06:04:05.091591 4869 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-l7tx6" podUID="6e774cc7-59dc-4b84-b0b7-4d2171a154c1" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.516677 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.523024 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.528999 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.530355 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.530554 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.530727 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-mrtzd" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.537882 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.605565 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0991303d-6180-44e1-9baa-88ece3cdbfaf-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0991303d-6180-44e1-9baa-88ece3cdbfaf\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.606062 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0991303d-6180-44e1-9baa-88ece3cdbfaf-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0991303d-6180-44e1-9baa-88ece3cdbfaf\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.606109 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0991303d-6180-44e1-9baa-88ece3cdbfaf-config\") pod \"ovsdbserver-sb-0\" (UID: \"0991303d-6180-44e1-9baa-88ece3cdbfaf\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.606168 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0991303d-6180-44e1-9baa-88ece3cdbfaf-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0991303d-6180-44e1-9baa-88ece3cdbfaf\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.606335 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"0991303d-6180-44e1-9baa-88ece3cdbfaf\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.606387 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0991303d-6180-44e1-9baa-88ece3cdbfaf-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0991303d-6180-44e1-9baa-88ece3cdbfaf\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.606514 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/0991303d-6180-44e1-9baa-88ece3cdbfaf-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0991303d-6180-44e1-9baa-88ece3cdbfaf\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.606753 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jpkk\" (UniqueName: \"kubernetes.io/projected/0991303d-6180-44e1-9baa-88ece3cdbfaf-kube-api-access-5jpkk\") pod \"ovsdbserver-sb-0\" (UID: \"0991303d-6180-44e1-9baa-88ece3cdbfaf\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.617111 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.725006 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jpkk\" (UniqueName: \"kubernetes.io/projected/0991303d-6180-44e1-9baa-88ece3cdbfaf-kube-api-access-5jpkk\") pod \"ovsdbserver-sb-0\" (UID: \"0991303d-6180-44e1-9baa-88ece3cdbfaf\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.725103 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0991303d-6180-44e1-9baa-88ece3cdbfaf-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0991303d-6180-44e1-9baa-88ece3cdbfaf\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.725130 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0991303d-6180-44e1-9baa-88ece3cdbfaf-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0991303d-6180-44e1-9baa-88ece3cdbfaf\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.725156 4869 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0991303d-6180-44e1-9baa-88ece3cdbfaf-config\") pod \"ovsdbserver-sb-0\" (UID: \"0991303d-6180-44e1-9baa-88ece3cdbfaf\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.725183 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0991303d-6180-44e1-9baa-88ece3cdbfaf-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0991303d-6180-44e1-9baa-88ece3cdbfaf\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.725207 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"0991303d-6180-44e1-9baa-88ece3cdbfaf\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.725230 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0991303d-6180-44e1-9baa-88ece3cdbfaf-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0991303d-6180-44e1-9baa-88ece3cdbfaf\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.725256 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0991303d-6180-44e1-9baa-88ece3cdbfaf-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0991303d-6180-44e1-9baa-88ece3cdbfaf\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.725909 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0991303d-6180-44e1-9baa-88ece3cdbfaf-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0991303d-6180-44e1-9baa-88ece3cdbfaf\") " 
pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.727666 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"0991303d-6180-44e1-9baa-88ece3cdbfaf\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.728149 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0991303d-6180-44e1-9baa-88ece3cdbfaf-config\") pod \"ovsdbserver-sb-0\" (UID: \"0991303d-6180-44e1-9baa-88ece3cdbfaf\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.728643 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0991303d-6180-44e1-9baa-88ece3cdbfaf-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0991303d-6180-44e1-9baa-88ece3cdbfaf\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.732767 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0991303d-6180-44e1-9baa-88ece3cdbfaf-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0991303d-6180-44e1-9baa-88ece3cdbfaf\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.733294 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0991303d-6180-44e1-9baa-88ece3cdbfaf-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0991303d-6180-44e1-9baa-88ece3cdbfaf\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.737103 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/0991303d-6180-44e1-9baa-88ece3cdbfaf-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0991303d-6180-44e1-9baa-88ece3cdbfaf\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.747528 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"0991303d-6180-44e1-9baa-88ece3cdbfaf\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.749445 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jpkk\" (UniqueName: \"kubernetes.io/projected/0991303d-6180-44e1-9baa-88ece3cdbfaf-kube-api-access-5jpkk\") pod \"ovsdbserver-sb-0\" (UID: \"0991303d-6180-44e1-9baa-88ece3cdbfaf\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.862905 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:05 crc kubenswrapper[4869]: W0218 06:04:05.985939 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23dc38a3_9ce0_4f1f_9495_2dc65f2474e5.slice/crio-9a21bec8c5c6a8b9591fe054653a566c026c1b0058ee3b581df044db6f2b62d8 WatchSource:0}: Error finding container 9a21bec8c5c6a8b9591fe054653a566c026c1b0058ee3b581df044db6f2b62d8: Status 404 returned error can't find the container with id 9a21bec8c5c6a8b9591fe054653a566c026c1b0058ee3b581df044db6f2b62d8 Feb 18 06:04:05 crc kubenswrapper[4869]: I0218 06:04:05.992904 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 06:04:05 crc kubenswrapper[4869]: W0218 06:04:05.999131 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3620bb0b_e8d0_4dd0_a2f1_6960f4ca8e90.slice/crio-cbff143e98c3b11a108fb99da87eed7d495bdf541a0319cc73cb3c691cf43ea3 WatchSource:0}: Error finding container cbff143e98c3b11a108fb99da87eed7d495bdf541a0319cc73cb3c691cf43ea3: Status 404 returned error can't find the container with id cbff143e98c3b11a108fb99da87eed7d495bdf541a0319cc73cb3c691cf43ea3 Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.016714 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.032290 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.054827 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ssqcs" Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.135803 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqz7f\" (UniqueName: \"kubernetes.io/projected/5147618e-26e5-44b5-a963-be10c5d6513b-kube-api-access-nqz7f\") pod \"5147618e-26e5-44b5-a963-be10c5d6513b\" (UID: \"5147618e-26e5-44b5-a963-be10c5d6513b\") " Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.136651 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5147618e-26e5-44b5-a963-be10c5d6513b-config\") pod \"5147618e-26e5-44b5-a963-be10c5d6513b\" (UID: \"5147618e-26e5-44b5-a963-be10c5d6513b\") " Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.137705 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5147618e-26e5-44b5-a963-be10c5d6513b-config" (OuterVolumeSpecName: "config") pod "5147618e-26e5-44b5-a963-be10c5d6513b" (UID: "5147618e-26e5-44b5-a963-be10c5d6513b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.146927 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5147618e-26e5-44b5-a963-be10c5d6513b-kube-api-access-nqz7f" (OuterVolumeSpecName: "kube-api-access-nqz7f") pod "5147618e-26e5-44b5-a963-be10c5d6513b" (UID: "5147618e-26e5-44b5-a963-be10c5d6513b"). InnerVolumeSpecName "kube-api-access-nqz7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.162818 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6zzxt"] Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.190712 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-l7tx6" Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.238825 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5147618e-26e5-44b5-a963-be10c5d6513b-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.238868 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqz7f\" (UniqueName: \"kubernetes.io/projected/5147618e-26e5-44b5-a963-be10c5d6513b-kube-api-access-nqz7f\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.240820 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 18 06:04:06 crc kubenswrapper[4869]: W0218 06:04:06.243018 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6c89f7e_0259_4c42_9e24_cd8391cda1a3.slice/crio-16729965e142c5d8cdb0970fd1a43ba52d9a99b3bd21392996a42c3e261e63fa WatchSource:0}: Error finding container 16729965e142c5d8cdb0970fd1a43ba52d9a99b3bd21392996a42c3e261e63fa: Status 404 returned error can't find the container with id 16729965e142c5d8cdb0970fd1a43ba52d9a99b3bd21392996a42c3e261e63fa Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.247791 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 18 06:04:06 crc kubenswrapper[4869]: W0218 06:04:06.265343 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod128630be_af69_4db6_bad0_59f17dc9dec0.slice/crio-d04413323b249d5047f79747ce1a1897734459bfac4161d23c6ce5ca1db5463a WatchSource:0}: Error finding container d04413323b249d5047f79747ce1a1897734459bfac4161d23c6ce5ca1db5463a: Status 404 returned error can't find the container with id d04413323b249d5047f79747ce1a1897734459bfac4161d23c6ce5ca1db5463a Feb 18 
06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.340574 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e774cc7-59dc-4b84-b0b7-4d2171a154c1-config\") pod \"6e774cc7-59dc-4b84-b0b7-4d2171a154c1\" (UID: \"6e774cc7-59dc-4b84-b0b7-4d2171a154c1\") " Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.341125 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9pwv\" (UniqueName: \"kubernetes.io/projected/6e774cc7-59dc-4b84-b0b7-4d2171a154c1-kube-api-access-x9pwv\") pod \"6e774cc7-59dc-4b84-b0b7-4d2171a154c1\" (UID: \"6e774cc7-59dc-4b84-b0b7-4d2171a154c1\") " Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.341181 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e774cc7-59dc-4b84-b0b7-4d2171a154c1-dns-svc\") pod \"6e774cc7-59dc-4b84-b0b7-4d2171a154c1\" (UID: \"6e774cc7-59dc-4b84-b0b7-4d2171a154c1\") " Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.341651 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e774cc7-59dc-4b84-b0b7-4d2171a154c1-config" (OuterVolumeSpecName: "config") pod "6e774cc7-59dc-4b84-b0b7-4d2171a154c1" (UID: "6e774cc7-59dc-4b84-b0b7-4d2171a154c1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.342160 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e774cc7-59dc-4b84-b0b7-4d2171a154c1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6e774cc7-59dc-4b84-b0b7-4d2171a154c1" (UID: "6e774cc7-59dc-4b84-b0b7-4d2171a154c1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.349440 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e774cc7-59dc-4b84-b0b7-4d2171a154c1-kube-api-access-x9pwv" (OuterVolumeSpecName: "kube-api-access-x9pwv") pod "6e774cc7-59dc-4b84-b0b7-4d2171a154c1" (UID: "6e774cc7-59dc-4b84-b0b7-4d2171a154c1"). InnerVolumeSpecName "kube-api-access-x9pwv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.358476 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.410114 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90","Type":"ContainerStarted","Data":"cbff143e98c3b11a108fb99da87eed7d495bdf541a0319cc73cb3c691cf43ea3"}
Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.412472 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"25b0cd0b-8d96-4067-a1da-171e5f0b9545","Type":"ContainerStarted","Data":"11c4504f755cafb24bbba0fd460c85ea64db42e1792cac99ad3bc417a0f54943"}
Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.414086 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a6c89f7e-0259-4c42-9e24-cd8391cda1a3","Type":"ContainerStarted","Data":"16729965e142c5d8cdb0970fd1a43ba52d9a99b3bd21392996a42c3e261e63fa"}
Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.415623 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"15f90eb3-a8d8-489d-b8f6-41046e14e165","Type":"ContainerStarted","Data":"1cbbb86a3d8c4cbfb41f93e07d9da23bcfa04b2856c728a8b595c1d0730ce287"}
Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.418536 4869 generic.go:334] "Generic (PLEG): container finished" podID="bb0523a5-e2bc-45f8-aff2-de770dcf88b2" containerID="85bdfad990098054cacb33bd58daf0b06f02de291e8abaf24581148c7961494d" exitCode=0
Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.418606 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-tlzx5" event={"ID":"bb0523a5-e2bc-45f8-aff2-de770dcf88b2","Type":"ContainerDied","Data":"85bdfad990098054cacb33bd58daf0b06f02de291e8abaf24581148c7961494d"}
Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.421442 4869 generic.go:334] "Generic (PLEG): container finished" podID="5935f943-3934-43be-a2c9-6b12d9cb8188" containerID="b72f94f089437e9c54c63b4afdc311c1a5f3e01b3eeb72681cbd9af5b487304e" exitCode=0
Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.421551 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-dgrsz" event={"ID":"5935f943-3934-43be-a2c9-6b12d9cb8188","Type":"ContainerDied","Data":"b72f94f089437e9c54c63b4afdc311c1a5f3e01b3eeb72681cbd9af5b487304e"}
Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.425687 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ssqcs"
Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.425672 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-ssqcs" event={"ID":"5147618e-26e5-44b5-a963-be10c5d6513b","Type":"ContainerDied","Data":"0a8cd41d55277bba49fd85b9c66d083897e57fbb1d8cbff3a34df2dd6d9c009f"}
Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.428268 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"128630be-af69-4db6-bad0-59f17dc9dec0","Type":"ContainerStarted","Data":"d04413323b249d5047f79747ce1a1897734459bfac4161d23c6ce5ca1db5463a"}
Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.429964 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6zzxt" event={"ID":"0b85434e-56f8-4cab-91a5-8cf0ea0356fc","Type":"ContainerStarted","Data":"33b5e3fb61156d20fe4ef18af3606088a92070c9677fa8956afb34611561cdda"}
Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.431944 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-l7tx6" event={"ID":"6e774cc7-59dc-4b84-b0b7-4d2171a154c1","Type":"ContainerDied","Data":"15912416a158ac601772a24caefe70661b6dbcb6490548439db776e072bf7a6c"}
Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.432015 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-l7tx6"
Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.434155 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5","Type":"ContainerStarted","Data":"9a21bec8c5c6a8b9591fe054653a566c026c1b0058ee3b581df044db6f2b62d8"}
Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.435426 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"32064888-24ad-482d-ba16-36bfb48b069e","Type":"ContainerStarted","Data":"35e2f96b941ffec3c93cde6e9a4001490cbd0a0cfe117009919056bf9a61a05c"}
Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.442788 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e774cc7-59dc-4b84-b0b7-4d2171a154c1-config\") on node \"crc\" DevicePath \"\""
Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.442821 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9pwv\" (UniqueName: \"kubernetes.io/projected/6e774cc7-59dc-4b84-b0b7-4d2171a154c1-kube-api-access-x9pwv\") on node \"crc\" DevicePath \"\""
Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.442835 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e774cc7-59dc-4b84-b0b7-4d2171a154c1-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.557714 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-l7tx6"]
Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.576831 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-l7tx6"]
Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.598791 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ssqcs"]
Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.611806 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ssqcs"]
Feb 18 06:04:06 crc kubenswrapper[4869]: I0218 06:04:06.616818 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 18 06:04:06 crc kubenswrapper[4869]: E0218 06:04:06.729580 4869 log.go:32] "CreateContainer in sandbox from runtime service failed" err=<
Feb 18 06:04:06 crc kubenswrapper[4869]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/5935f943-3934-43be-a2c9-6b12d9cb8188/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Feb 18 06:04:06 crc kubenswrapper[4869]: > podSandboxID="0eee79ceaf576ec964448f0c09905a1f62771ee9b948bae313d0ab275db29304"
Feb 18 06:04:06 crc kubenswrapper[4869]: E0218 06:04:06.730157 4869 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Feb 18 06:04:06 crc kubenswrapper[4869]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bt9lb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-dgrsz_openstack(5935f943-3934-43be-a2c9-6b12d9cb8188): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/5935f943-3934-43be-a2c9-6b12d9cb8188/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Feb 18 06:04:06 crc kubenswrapper[4869]: > logger="UnhandledError"
Feb 18 06:04:06 crc kubenswrapper[4869]: E0218 06:04:06.731610 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/5935f943-3934-43be-a2c9-6b12d9cb8188/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-666b6646f7-dgrsz" podUID="5935f943-3934-43be-a2c9-6b12d9cb8188"
Feb 18 06:04:07 crc kubenswrapper[4869]: I0218 06:04:07.435963 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-5czp2"]
Feb 18 06:04:07 crc kubenswrapper[4869]: I0218 06:04:07.447000 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-tlzx5" event={"ID":"bb0523a5-e2bc-45f8-aff2-de770dcf88b2","Type":"ContainerStarted","Data":"aa23597f51eb49ef7f9286909c936650102b08b9ee0c4f8d5a03eca17ea72431"}
Feb 18 06:04:07 crc kubenswrapper[4869]: I0218 06:04:07.457042 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-tlzx5"
Feb 18 06:04:07 crc kubenswrapper[4869]: I0218 06:04:07.493299 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-tlzx5" podStartSLOduration=7.013785564 podStartE2EDuration="15.493276533s" podCreationTimestamp="2026-02-18 06:03:52 +0000 UTC" firstStartedPulling="2026-02-18 06:03:56.812648482 +0000 UTC m=+933.981736714" lastFinishedPulling="2026-02-18 06:04:05.292139441 +0000 UTC m=+942.461227683" observedRunningTime="2026-02-18 06:04:07.475154993 +0000 UTC m=+944.644243215" watchObservedRunningTime="2026-02-18 06:04:07.493276533 +0000 UTC m=+944.662364765"
Feb 18 06:04:07 crc kubenswrapper[4869]: I0218 06:04:07.500547 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5147618e-26e5-44b5-a963-be10c5d6513b" path="/var/lib/kubelet/pods/5147618e-26e5-44b5-a963-be10c5d6513b/volumes"
Feb 18 06:04:07 crc kubenswrapper[4869]: I0218 06:04:07.501270 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e774cc7-59dc-4b84-b0b7-4d2171a154c1" path="/var/lib/kubelet/pods/6e774cc7-59dc-4b84-b0b7-4d2171a154c1/volumes"
Feb 18 06:04:07 crc kubenswrapper[4869]: I0218 06:04:07.502226 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0991303d-6180-44e1-9baa-88ece3cdbfaf","Type":"ContainerStarted","Data":"4ddab94a0fe2ceeea4416456b7df3d1ba14a51ea4a0251ae91ebd0f220578fbe"}
Feb 18 06:04:07 crc kubenswrapper[4869]: W0218 06:04:07.698116 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e4e9056_1f05_4fc5_b1a1_e578abbc24c6.slice/crio-c7aa41cf3dc4108423cd98848af021bfcd759a36ae4d30b85b5106a09522ce77 WatchSource:0}: Error finding container c7aa41cf3dc4108423cd98848af021bfcd759a36ae4d30b85b5106a09522ce77: Status 404 returned error can't find the container with id c7aa41cf3dc4108423cd98848af021bfcd759a36ae4d30b85b5106a09522ce77
Feb 18 06:04:08 crc kubenswrapper[4869]: I0218 06:04:08.505269 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5czp2" event={"ID":"8e4e9056-1f05-4fc5-b1a1-e578abbc24c6","Type":"ContainerStarted","Data":"c7aa41cf3dc4108423cd98848af021bfcd759a36ae4d30b85b5106a09522ce77"}
Feb 18 06:04:10 crc kubenswrapper[4869]: I0218 06:04:10.132983 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 06:04:10 crc kubenswrapper[4869]: I0218 06:04:10.133358 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 06:04:10 crc kubenswrapper[4869]: I0218 06:04:10.133406 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh"
Feb 18 06:04:10 crc kubenswrapper[4869]: I0218 06:04:10.134278 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e9e47b16933a8107451e04ae8f93c9313979bca3d095548d99cb42d4297f33ca"} pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 18 06:04:10 crc kubenswrapper[4869]: I0218 06:04:10.134346 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" containerID="cri-o://e9e47b16933a8107451e04ae8f93c9313979bca3d095548d99cb42d4297f33ca" gracePeriod=600
Feb 18 06:04:11 crc kubenswrapper[4869]: I0218 06:04:11.530523 4869 generic.go:334] "Generic (PLEG): container finished" podID="781aec66-5fc7-4161-a704-cc78830d525d" containerID="e9e47b16933a8107451e04ae8f93c9313979bca3d095548d99cb42d4297f33ca" exitCode=0
Feb 18 06:04:11 crc kubenswrapper[4869]: I0218 06:04:11.530593 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerDied","Data":"e9e47b16933a8107451e04ae8f93c9313979bca3d095548d99cb42d4297f33ca"}
Feb 18 06:04:11 crc kubenswrapper[4869]: I0218 06:04:11.530881 4869 scope.go:117] "RemoveContainer" containerID="7562193726eefe80121fb4b3382e37ca22d430274c9cd86ef820a8666e2ec8f9"
Feb 18 06:04:12 crc kubenswrapper[4869]: I0218 06:04:12.483942 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-tlzx5"
Feb 18 06:04:12 crc kubenswrapper[4869]: I0218 06:04:12.591592 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dgrsz"]
Feb 18 06:04:15 crc kubenswrapper[4869]: I0218 06:04:15.599958 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerStarted","Data":"e88c90367f7599ac382291baac95a475e9f7f579d4283380c069d22ac74cf0e6"}
Feb 18 06:04:16 crc kubenswrapper[4869]: I0218 06:04:16.612324 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"32064888-24ad-482d-ba16-36bfb48b069e","Type":"ContainerStarted","Data":"4f749dfa5d494de82a87664df065ff28ec70b47d9cd264e6d13dc0ef9bb816bf"}
Feb 18 06:04:16 crc kubenswrapper[4869]: I0218 06:04:16.614068 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"25b0cd0b-8d96-4067-a1da-171e5f0b9545","Type":"ContainerStarted","Data":"022ae557735db181447f0347595c96326c771bfbbaed6035f10c11de0912db83"}
Feb 18 06:04:16 crc kubenswrapper[4869]: I0218 06:04:16.616355 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5","Type":"ContainerStarted","Data":"62f354a9b63e62a04e73d1d162cd78b3e8b91091dc7977b1d36941ef1d3999d1"}
Feb 18 06:04:16 crc kubenswrapper[4869]: I0218 06:04:16.620534 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a6c89f7e-0259-4c42-9e24-cd8391cda1a3","Type":"ContainerStarted","Data":"4ec85a6695e832622951b1fcdbcc48a75f5e622b9fa75119d4c5a85aa814cd84"}
Feb 18 06:04:16 crc kubenswrapper[4869]: I0218 06:04:16.620601 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Feb 18 06:04:16 crc kubenswrapper[4869]: I0218 06:04:16.622715 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"15f90eb3-a8d8-489d-b8f6-41046e14e165","Type":"ContainerStarted","Data":"00eeaaab546dfd1b221345a3f48c9875681142cf110a173f4094afe2a48c845c"}
Feb 18 06:04:16 crc kubenswrapper[4869]: I0218 06:04:16.624558 4869 generic.go:334] "Generic (PLEG): container finished" podID="8e4e9056-1f05-4fc5-b1a1-e578abbc24c6" containerID="048009d47ca6a011c644a7e2b26f56f6fd4a66afa5299cbbb6182ef03f8752c7" exitCode=0
Feb 18 06:04:16 crc kubenswrapper[4869]: I0218 06:04:16.624603 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5czp2" event={"ID":"8e4e9056-1f05-4fc5-b1a1-e578abbc24c6","Type":"ContainerDied","Data":"048009d47ca6a011c644a7e2b26f56f6fd4a66afa5299cbbb6182ef03f8752c7"}
Feb 18 06:04:16 crc kubenswrapper[4869]: I0218 06:04:16.626523 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6zzxt" event={"ID":"0b85434e-56f8-4cab-91a5-8cf0ea0356fc","Type":"ContainerStarted","Data":"dbb41a92622a08ebadb8450a511d56b04b8debde1c1b61917a0eea6c79cb84a0"}
Feb 18 06:04:16 crc kubenswrapper[4869]: I0218 06:04:16.626661 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-6zzxt"
Feb 18 06:04:16 crc kubenswrapper[4869]: I0218 06:04:16.628484 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0991303d-6180-44e1-9baa-88ece3cdbfaf","Type":"ContainerStarted","Data":"96b1935f8abfa0eae79f7671d82b10db375c7c701abd5e1d371b5b7505965a3e"}
Feb 18 06:04:16 crc kubenswrapper[4869]: I0218 06:04:16.632032 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-dgrsz" event={"ID":"5935f943-3934-43be-a2c9-6b12d9cb8188","Type":"ContainerStarted","Data":"6cd0fb03e237d6d7b6e5cb6495ec1ec8e0673ed82a7f86961e007d14cf0d064d"}
Feb 18 06:04:16 crc kubenswrapper[4869]: I0218 06:04:16.633974 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-dgrsz" podUID="5935f943-3934-43be-a2c9-6b12d9cb8188" containerName="dnsmasq-dns" containerID="cri-o://6cd0fb03e237d6d7b6e5cb6495ec1ec8e0673ed82a7f86961e007d14cf0d064d" gracePeriod=10
Feb 18 06:04:16 crc kubenswrapper[4869]: I0218 06:04:16.634127 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-dgrsz"
Feb 18 06:04:16 crc kubenswrapper[4869]: I0218 06:04:16.640184 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"128630be-af69-4db6-bad0-59f17dc9dec0","Type":"ContainerStarted","Data":"258272a79799cbbf19c708f1c9836878f13b16b684680baad23a96582051cdb6"}
Feb 18 06:04:16 crc kubenswrapper[4869]: I0218 06:04:16.640337 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Feb 18 06:04:16 crc kubenswrapper[4869]: I0218 06:04:16.679711 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-6zzxt" podStartSLOduration=7.099074893 podStartE2EDuration="15.67969148s" podCreationTimestamp="2026-02-18 06:04:01 +0000 UTC" firstStartedPulling="2026-02-18 06:04:06.248314314 +0000 UTC m=+943.417402536" lastFinishedPulling="2026-02-18 06:04:14.828930851 +0000 UTC m=+951.998019123" observedRunningTime="2026-02-18 06:04:16.679034545 +0000 UTC m=+953.848122777" watchObservedRunningTime="2026-02-18 06:04:16.67969148 +0000 UTC m=+953.848779712"
Feb 18 06:04:16 crc kubenswrapper[4869]: I0218 06:04:16.765551 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-dgrsz" podStartSLOduration=13.151665331 podStartE2EDuration="25.765530365s" podCreationTimestamp="2026-02-18 06:03:51 +0000 UTC" firstStartedPulling="2026-02-18 06:03:52.672030726 +0000 UTC m=+929.841118958" lastFinishedPulling="2026-02-18 06:04:05.28589576 +0000 UTC m=+942.454983992" observedRunningTime="2026-02-18 06:04:16.760454753 +0000 UTC m=+953.929542995" watchObservedRunningTime="2026-02-18 06:04:16.765530365 +0000 UTC m=+953.934618597"
Feb 18 06:04:16 crc kubenswrapper[4869]: I0218 06:04:16.798927 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=12.447235811 podStartE2EDuration="20.798904896s" podCreationTimestamp="2026-02-18 06:03:56 +0000 UTC" firstStartedPulling="2026-02-18 06:04:06.24691613 +0000 UTC m=+943.416004362" lastFinishedPulling="2026-02-18 06:04:14.598585195 +0000 UTC m=+951.767673447" observedRunningTime="2026-02-18 06:04:16.781779881 +0000 UTC m=+953.950868113" watchObservedRunningTime="2026-02-18 06:04:16.798904896 +0000 UTC m=+953.967993128"
Feb 18 06:04:16 crc kubenswrapper[4869]: I0218 06:04:16.804315 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=9.794118994 podStartE2EDuration="18.804297878s" podCreationTimestamp="2026-02-18 06:03:58 +0000 UTC" firstStartedPulling="2026-02-18 06:04:06.270949604 +0000 UTC m=+943.440037836" lastFinishedPulling="2026-02-18 06:04:15.281128498 +0000 UTC m=+952.450216720" observedRunningTime="2026-02-18 06:04:16.796493099 +0000 UTC m=+953.965581331" watchObservedRunningTime="2026-02-18 06:04:16.804297878 +0000 UTC m=+953.973386110"
Feb 18 06:04:17 crc kubenswrapper[4869]: I0218 06:04:17.524648 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-dgrsz"
Feb 18 06:04:17 crc kubenswrapper[4869]: I0218 06:04:17.700806 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5935f943-3934-43be-a2c9-6b12d9cb8188-config\") pod \"5935f943-3934-43be-a2c9-6b12d9cb8188\" (UID: \"5935f943-3934-43be-a2c9-6b12d9cb8188\") "
Feb 18 06:04:17 crc kubenswrapper[4869]: I0218 06:04:17.700910 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5935f943-3934-43be-a2c9-6b12d9cb8188-dns-svc\") pod \"5935f943-3934-43be-a2c9-6b12d9cb8188\" (UID: \"5935f943-3934-43be-a2c9-6b12d9cb8188\") "
Feb 18 06:04:17 crc kubenswrapper[4869]: I0218 06:04:17.701174 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt9lb\" (UniqueName: \"kubernetes.io/projected/5935f943-3934-43be-a2c9-6b12d9cb8188-kube-api-access-bt9lb\") pod \"5935f943-3934-43be-a2c9-6b12d9cb8188\" (UID: \"5935f943-3934-43be-a2c9-6b12d9cb8188\") "
Feb 18 06:04:17 crc kubenswrapper[4869]: I0218 06:04:17.725038 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5935f943-3934-43be-a2c9-6b12d9cb8188-kube-api-access-bt9lb" (OuterVolumeSpecName: "kube-api-access-bt9lb") pod "5935f943-3934-43be-a2c9-6b12d9cb8188" (UID: "5935f943-3934-43be-a2c9-6b12d9cb8188"). InnerVolumeSpecName "kube-api-access-bt9lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:04:17 crc kubenswrapper[4869]: I0218 06:04:17.729732 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5czp2" event={"ID":"8e4e9056-1f05-4fc5-b1a1-e578abbc24c6","Type":"ContainerStarted","Data":"7c37b44d57f237f838c4f16ab8223862fef939eb77cc0605fa6d541b58c67d5f"}
Feb 18 06:04:17 crc kubenswrapper[4869]: I0218 06:04:17.732799 4869 generic.go:334] "Generic (PLEG): container finished" podID="5935f943-3934-43be-a2c9-6b12d9cb8188" containerID="6cd0fb03e237d6d7b6e5cb6495ec1ec8e0673ed82a7f86961e007d14cf0d064d" exitCode=0
Feb 18 06:04:17 crc kubenswrapper[4869]: I0218 06:04:17.732851 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-dgrsz"
Feb 18 06:04:17 crc kubenswrapper[4869]: I0218 06:04:17.732850 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-dgrsz" event={"ID":"5935f943-3934-43be-a2c9-6b12d9cb8188","Type":"ContainerDied","Data":"6cd0fb03e237d6d7b6e5cb6495ec1ec8e0673ed82a7f86961e007d14cf0d064d"}
Feb 18 06:04:17 crc kubenswrapper[4869]: I0218 06:04:17.733403 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-dgrsz" event={"ID":"5935f943-3934-43be-a2c9-6b12d9cb8188","Type":"ContainerDied","Data":"0eee79ceaf576ec964448f0c09905a1f62771ee9b948bae313d0ab275db29304"}
Feb 18 06:04:17 crc kubenswrapper[4869]: I0218 06:04:17.733475 4869 scope.go:117] "RemoveContainer" containerID="6cd0fb03e237d6d7b6e5cb6495ec1ec8e0673ed82a7f86961e007d14cf0d064d"
Feb 18 06:04:17 crc kubenswrapper[4869]: I0218 06:04:17.740689 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90","Type":"ContainerStarted","Data":"1683808ddc02a9672a20148fe3fe1be88215576ea37bf98a1fdafb9845128cd9"}
Feb 18 06:04:17 crc kubenswrapper[4869]: E0218 06:04:17.759218 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5935f943-3934-43be-a2c9-6b12d9cb8188-config podName:5935f943-3934-43be-a2c9-6b12d9cb8188 nodeName:}" failed. No retries permitted until 2026-02-18 06:04:18.259195339 +0000 UTC m=+955.428283571 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config" (UniqueName: "kubernetes.io/configmap/5935f943-3934-43be-a2c9-6b12d9cb8188-config") pod "5935f943-3934-43be-a2c9-6b12d9cb8188" (UID: "5935f943-3934-43be-a2c9-6b12d9cb8188") : error deleting /var/lib/kubelet/pods/5935f943-3934-43be-a2c9-6b12d9cb8188/volume-subpaths: remove /var/lib/kubelet/pods/5935f943-3934-43be-a2c9-6b12d9cb8188/volume-subpaths: no such file or directory
Feb 18 06:04:17 crc kubenswrapper[4869]: I0218 06:04:17.759489 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5935f943-3934-43be-a2c9-6b12d9cb8188-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5935f943-3934-43be-a2c9-6b12d9cb8188" (UID: "5935f943-3934-43be-a2c9-6b12d9cb8188"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:04:17 crc kubenswrapper[4869]: I0218 06:04:17.802888 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5935f943-3934-43be-a2c9-6b12d9cb8188-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 18 06:04:17 crc kubenswrapper[4869]: I0218 06:04:17.802922 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt9lb\" (UniqueName: \"kubernetes.io/projected/5935f943-3934-43be-a2c9-6b12d9cb8188-kube-api-access-bt9lb\") on node \"crc\" DevicePath \"\""
Feb 18 06:04:17 crc kubenswrapper[4869]: I0218 06:04:17.868660 4869 scope.go:117] "RemoveContainer" containerID="b72f94f089437e9c54c63b4afdc311c1a5f3e01b3eeb72681cbd9af5b487304e"
Feb 18 06:04:18 crc kubenswrapper[4869]: I0218 06:04:18.078620 4869 scope.go:117] "RemoveContainer" containerID="6cd0fb03e237d6d7b6e5cb6495ec1ec8e0673ed82a7f86961e007d14cf0d064d"
Feb 18 06:04:18 crc kubenswrapper[4869]: E0218 06:04:18.079364 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cd0fb03e237d6d7b6e5cb6495ec1ec8e0673ed82a7f86961e007d14cf0d064d\": container with ID starting with 6cd0fb03e237d6d7b6e5cb6495ec1ec8e0673ed82a7f86961e007d14cf0d064d not found: ID does not exist" containerID="6cd0fb03e237d6d7b6e5cb6495ec1ec8e0673ed82a7f86961e007d14cf0d064d"
Feb 18 06:04:18 crc kubenswrapper[4869]: I0218 06:04:18.079390 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cd0fb03e237d6d7b6e5cb6495ec1ec8e0673ed82a7f86961e007d14cf0d064d"} err="failed to get container status \"6cd0fb03e237d6d7b6e5cb6495ec1ec8e0673ed82a7f86961e007d14cf0d064d\": rpc error: code = NotFound desc = could not find container \"6cd0fb03e237d6d7b6e5cb6495ec1ec8e0673ed82a7f86961e007d14cf0d064d\": container with ID starting with 6cd0fb03e237d6d7b6e5cb6495ec1ec8e0673ed82a7f86961e007d14cf0d064d not found: ID does not exist"
Feb 18 06:04:18 crc kubenswrapper[4869]: I0218 06:04:18.079413 4869 scope.go:117] "RemoveContainer" containerID="b72f94f089437e9c54c63b4afdc311c1a5f3e01b3eeb72681cbd9af5b487304e"
Feb 18 06:04:18 crc kubenswrapper[4869]: E0218 06:04:18.080887 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b72f94f089437e9c54c63b4afdc311c1a5f3e01b3eeb72681cbd9af5b487304e\": container with ID starting with b72f94f089437e9c54c63b4afdc311c1a5f3e01b3eeb72681cbd9af5b487304e not found: ID does not exist" containerID="b72f94f089437e9c54c63b4afdc311c1a5f3e01b3eeb72681cbd9af5b487304e"
Feb 18 06:04:18 crc kubenswrapper[4869]: I0218 06:04:18.080918 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b72f94f089437e9c54c63b4afdc311c1a5f3e01b3eeb72681cbd9af5b487304e"} err="failed to get container status \"b72f94f089437e9c54c63b4afdc311c1a5f3e01b3eeb72681cbd9af5b487304e\": rpc error: code = NotFound desc = could not find container \"b72f94f089437e9c54c63b4afdc311c1a5f3e01b3eeb72681cbd9af5b487304e\": container with ID starting with b72f94f089437e9c54c63b4afdc311c1a5f3e01b3eeb72681cbd9af5b487304e not found: ID does not exist"
Feb 18 06:04:18 crc kubenswrapper[4869]: I0218 06:04:18.343273 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5935f943-3934-43be-a2c9-6b12d9cb8188-config\") pod \"5935f943-3934-43be-a2c9-6b12d9cb8188\" (UID: \"5935f943-3934-43be-a2c9-6b12d9cb8188\") "
Feb 18 06:04:18 crc kubenswrapper[4869]: I0218 06:04:18.344016 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5935f943-3934-43be-a2c9-6b12d9cb8188-config" (OuterVolumeSpecName: "config") pod "5935f943-3934-43be-a2c9-6b12d9cb8188" (UID: "5935f943-3934-43be-a2c9-6b12d9cb8188"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:04:19 crc kubenswrapper[4869]: I0218 06:04:19.013549 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5935f943-3934-43be-a2c9-6b12d9cb8188-config\") on node \"crc\" DevicePath \"\""
Feb 18 06:04:19 crc kubenswrapper[4869]: I0218 06:04:19.111214 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dgrsz"]
Feb 18 06:04:19 crc kubenswrapper[4869]: I0218 06:04:19.119295 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dgrsz"]
Feb 18 06:04:19 crc kubenswrapper[4869]: I0218 06:04:19.487526 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5935f943-3934-43be-a2c9-6b12d9cb8188" path="/var/lib/kubelet/pods/5935f943-3934-43be-a2c9-6b12d9cb8188/volumes"
Feb 18 06:04:20 crc kubenswrapper[4869]: I0218 06:04:20.054320 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"25b0cd0b-8d96-4067-a1da-171e5f0b9545","Type":"ContainerStarted","Data":"ce06f6daf4a5c535207bdb1cea76d503fdc3be1cb03bf1f69023d20bfaa81a12"}
Feb 18 06:04:20 crc kubenswrapper[4869]: I0218 06:04:20.062066 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5czp2" event={"ID":"8e4e9056-1f05-4fc5-b1a1-e578abbc24c6","Type":"ContainerStarted","Data":"62681aa4277f2bbb2e2b9c276912ffe2def5a4d2730eeddbdca5986ad4980a5c"}
Feb 18 06:04:20 crc kubenswrapper[4869]: I0218 06:04:20.062178 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-5czp2"
Feb 18 06:04:20 crc kubenswrapper[4869]: I0218 06:04:20.062413 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-5czp2"
Feb 18 06:04:20 crc kubenswrapper[4869]: I0218 06:04:20.065508 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0991303d-6180-44e1-9baa-88ece3cdbfaf","Type":"ContainerStarted","Data":"1ce0fdf307709e448b10cb38d5e19d3348b5f50585963cc739d2290bab80bb9c"}
Feb 18 06:04:20 crc kubenswrapper[4869]: I0218 06:04:20.067584 4869 generic.go:334] "Generic (PLEG): container finished" podID="32064888-24ad-482d-ba16-36bfb48b069e" containerID="4f749dfa5d494de82a87664df065ff28ec70b47d9cd264e6d13dc0ef9bb816bf" exitCode=0
Feb 18 06:04:20 crc kubenswrapper[4869]: I0218 06:04:20.067658 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"32064888-24ad-482d-ba16-36bfb48b069e","Type":"ContainerDied","Data":"4f749dfa5d494de82a87664df065ff28ec70b47d9cd264e6d13dc0ef9bb816bf"}
Feb 18 06:04:20 crc kubenswrapper[4869]: I0218 06:04:20.092662 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=8.389875908 podStartE2EDuration="20.092619676s" podCreationTimestamp="2026-02-18 06:04:00 +0000 UTC" firstStartedPulling="2026-02-18 06:04:06.387576767 +0000 UTC m=+943.556664989" lastFinishedPulling="2026-02-18 06:04:18.090320525 +0000 UTC m=+955.259408757" observedRunningTime="2026-02-18 06:04:20.083038643 +0000 UTC m=+957.252126875" watchObservedRunningTime="2026-02-18 06:04:20.092619676 +0000 UTC m=+957.261707948"
Feb 18 06:04:20 crc kubenswrapper[4869]: I0218 06:04:20.141104 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.668755094 podStartE2EDuration="16.141073792s" podCreationTimestamp="2026-02-18 06:04:04 +0000 UTC" firstStartedPulling="2026-02-18 06:04:06.638963996 +0000 UTC m=+943.808052228" lastFinishedPulling="2026-02-18 06:04:18.111282694 +0000 UTC m=+955.280370926" observedRunningTime="2026-02-18 06:04:20.135924658 +0000 UTC m=+957.305012980" watchObservedRunningTime="2026-02-18 06:04:20.141073792 +0000 UTC m=+957.310162034"
Feb 18 06:04:20 crc kubenswrapper[4869]: I0218 06:04:20.199308 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-5czp2" podStartSLOduration=12.159520879 podStartE2EDuration="19.199285127s" podCreationTimestamp="2026-02-18 06:04:01 +0000 UTC" firstStartedPulling="2026-02-18 06:04:07.701958094 +0000 UTC m=+944.871046326" lastFinishedPulling="2026-02-18 06:04:14.741722342 +0000 UTC m=+951.910810574" observedRunningTime="2026-02-18 06:04:20.197579006 +0000 UTC m=+957.366667248" watchObservedRunningTime="2026-02-18 06:04:20.199285127 +0000 UTC m=+957.368373359"
Feb 18 06:04:20 crc kubenswrapper[4869]: I0218 06:04:20.863826 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Feb 18 06:04:20 crc kubenswrapper[4869]: I0218 06:04:20.863890 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Feb 18 06:04:20 crc kubenswrapper[4869]: I0218 06:04:20.906406 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.076481 4869 generic.go:334] "Generic (PLEG): container finished" podID="23dc38a3-9ce0-4f1f-9495-2dc65f2474e5" containerID="62f354a9b63e62a04e73d1d162cd78b3e8b91091dc7977b1d36941ef1d3999d1" exitCode=0
Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.076581 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5","Type":"ContainerDied","Data":"62f354a9b63e62a04e73d1d162cd78b3e8b91091dc7977b1d36941ef1d3999d1"}
Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.079715 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"32064888-24ad-482d-ba16-36bfb48b069e","Type":"ContainerStarted","Data":"c3bba7f64b959b2542046f67bb598585ae6e7285c911d2a4e4f5d64faf085e8a"}
Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.136284 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=19.166396367 podStartE2EDuration="28.136258163s" podCreationTimestamp="2026-02-18 06:03:53 +0000 UTC" firstStartedPulling="2026-02-18 06:04:05.628721459 +0000 UTC m=+942.797809691" lastFinishedPulling="2026-02-18 06:04:14.598583255 +0000 UTC m=+951.767671487" observedRunningTime="2026-02-18 06:04:21.136166301 +0000 UTC m=+958.305254553" watchObservedRunningTime="2026-02-18 06:04:21.136258163 +0000 UTC m=+958.305346415"
Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.153683 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.426479 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.468057 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-hvt9b"]
Feb 18 06:04:21 crc kubenswrapper[4869]: E0218 06:04:21.468396 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5935f943-3934-43be-a2c9-6b12d9cb8188" containerName="init"
Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.468417 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="5935f943-3934-43be-a2c9-6b12d9cb8188" containerName="init"
Feb 18 06:04:21 crc kubenswrapper[4869]: E0218 06:04:21.468439 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5935f943-3934-43be-a2c9-6b12d9cb8188" containerName="dnsmasq-dns"
Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.468449 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="5935f943-3934-43be-a2c9-6b12d9cb8188" containerName="dnsmasq-dns"
Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.468677 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="5935f943-3934-43be-a2c9-6b12d9cb8188"
containerName="dnsmasq-dns" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.475664 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-hvt9b" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.478730 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.498699 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c63de375-f72f-4468-9a2a-b1c56c30eabe-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-hvt9b\" (UID: \"c63de375-f72f-4468-9a2a-b1c56c30eabe\") " pod="openstack/dnsmasq-dns-7f896c8c65-hvt9b" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.498945 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c63de375-f72f-4468-9a2a-b1c56c30eabe-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-hvt9b\" (UID: \"c63de375-f72f-4468-9a2a-b1c56c30eabe\") " pod="openstack/dnsmasq-dns-7f896c8c65-hvt9b" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.499036 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ghlf\" (UniqueName: \"kubernetes.io/projected/c63de375-f72f-4468-9a2a-b1c56c30eabe-kube-api-access-2ghlf\") pod \"dnsmasq-dns-7f896c8c65-hvt9b\" (UID: \"c63de375-f72f-4468-9a2a-b1c56c30eabe\") " pod="openstack/dnsmasq-dns-7f896c8c65-hvt9b" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.499086 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c63de375-f72f-4468-9a2a-b1c56c30eabe-config\") pod \"dnsmasq-dns-7f896c8c65-hvt9b\" (UID: \"c63de375-f72f-4468-9a2a-b1c56c30eabe\") " pod="openstack/dnsmasq-dns-7f896c8c65-hvt9b" Feb 18 
06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.504315 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-hvt9b"] Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.531138 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-47kkq"] Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.532132 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-47kkq" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.536586 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.563583 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-47kkq"] Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.599684 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb275de-f7e9-434d-b934-37dfb39e92ac-combined-ca-bundle\") pod \"ovn-controller-metrics-47kkq\" (UID: \"7cb275de-f7e9-434d-b934-37dfb39e92ac\") " pod="openstack/ovn-controller-metrics-47kkq" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.599732 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7cb275de-f7e9-434d-b934-37dfb39e92ac-ovn-rundir\") pod \"ovn-controller-metrics-47kkq\" (UID: \"7cb275de-f7e9-434d-b934-37dfb39e92ac\") " pod="openstack/ovn-controller-metrics-47kkq" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.599768 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q29fs\" (UniqueName: \"kubernetes.io/projected/7cb275de-f7e9-434d-b934-37dfb39e92ac-kube-api-access-q29fs\") pod \"ovn-controller-metrics-47kkq\" (UID: 
\"7cb275de-f7e9-434d-b934-37dfb39e92ac\") " pod="openstack/ovn-controller-metrics-47kkq" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.599826 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ghlf\" (UniqueName: \"kubernetes.io/projected/c63de375-f72f-4468-9a2a-b1c56c30eabe-kube-api-access-2ghlf\") pod \"dnsmasq-dns-7f896c8c65-hvt9b\" (UID: \"c63de375-f72f-4468-9a2a-b1c56c30eabe\") " pod="openstack/dnsmasq-dns-7f896c8c65-hvt9b" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.599851 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cb275de-f7e9-434d-b934-37dfb39e92ac-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-47kkq\" (UID: \"7cb275de-f7e9-434d-b934-37dfb39e92ac\") " pod="openstack/ovn-controller-metrics-47kkq" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.599887 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c63de375-f72f-4468-9a2a-b1c56c30eabe-config\") pod \"dnsmasq-dns-7f896c8c65-hvt9b\" (UID: \"c63de375-f72f-4468-9a2a-b1c56c30eabe\") " pod="openstack/dnsmasq-dns-7f896c8c65-hvt9b" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.599905 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7cb275de-f7e9-434d-b934-37dfb39e92ac-ovs-rundir\") pod \"ovn-controller-metrics-47kkq\" (UID: \"7cb275de-f7e9-434d-b934-37dfb39e92ac\") " pod="openstack/ovn-controller-metrics-47kkq" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.599930 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c63de375-f72f-4468-9a2a-b1c56c30eabe-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-hvt9b\" (UID: 
\"c63de375-f72f-4468-9a2a-b1c56c30eabe\") " pod="openstack/dnsmasq-dns-7f896c8c65-hvt9b" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.599974 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c63de375-f72f-4468-9a2a-b1c56c30eabe-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-hvt9b\" (UID: \"c63de375-f72f-4468-9a2a-b1c56c30eabe\") " pod="openstack/dnsmasq-dns-7f896c8c65-hvt9b" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.599994 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cb275de-f7e9-434d-b934-37dfb39e92ac-config\") pod \"ovn-controller-metrics-47kkq\" (UID: \"7cb275de-f7e9-434d-b934-37dfb39e92ac\") " pod="openstack/ovn-controller-metrics-47kkq" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.600911 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c63de375-f72f-4468-9a2a-b1c56c30eabe-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-hvt9b\" (UID: \"c63de375-f72f-4468-9a2a-b1c56c30eabe\") " pod="openstack/dnsmasq-dns-7f896c8c65-hvt9b" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.600980 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c63de375-f72f-4468-9a2a-b1c56c30eabe-config\") pod \"dnsmasq-dns-7f896c8c65-hvt9b\" (UID: \"c63de375-f72f-4468-9a2a-b1c56c30eabe\") " pod="openstack/dnsmasq-dns-7f896c8c65-hvt9b" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.601416 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c63de375-f72f-4468-9a2a-b1c56c30eabe-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-hvt9b\" (UID: \"c63de375-f72f-4468-9a2a-b1c56c30eabe\") " pod="openstack/dnsmasq-dns-7f896c8c65-hvt9b" Feb 18 06:04:21 crc 
kubenswrapper[4869]: I0218 06:04:21.619104 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ghlf\" (UniqueName: \"kubernetes.io/projected/c63de375-f72f-4468-9a2a-b1c56c30eabe-kube-api-access-2ghlf\") pod \"dnsmasq-dns-7f896c8c65-hvt9b\" (UID: \"c63de375-f72f-4468-9a2a-b1c56c30eabe\") " pod="openstack/dnsmasq-dns-7f896c8c65-hvt9b" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.701363 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cb275de-f7e9-434d-b934-37dfb39e92ac-config\") pod \"ovn-controller-metrics-47kkq\" (UID: \"7cb275de-f7e9-434d-b934-37dfb39e92ac\") " pod="openstack/ovn-controller-metrics-47kkq" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.701411 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb275de-f7e9-434d-b934-37dfb39e92ac-combined-ca-bundle\") pod \"ovn-controller-metrics-47kkq\" (UID: \"7cb275de-f7e9-434d-b934-37dfb39e92ac\") " pod="openstack/ovn-controller-metrics-47kkq" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.701441 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7cb275de-f7e9-434d-b934-37dfb39e92ac-ovn-rundir\") pod \"ovn-controller-metrics-47kkq\" (UID: \"7cb275de-f7e9-434d-b934-37dfb39e92ac\") " pod="openstack/ovn-controller-metrics-47kkq" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.701456 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q29fs\" (UniqueName: \"kubernetes.io/projected/7cb275de-f7e9-434d-b934-37dfb39e92ac-kube-api-access-q29fs\") pod \"ovn-controller-metrics-47kkq\" (UID: \"7cb275de-f7e9-434d-b934-37dfb39e92ac\") " pod="openstack/ovn-controller-metrics-47kkq" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.701500 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cb275de-f7e9-434d-b934-37dfb39e92ac-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-47kkq\" (UID: \"7cb275de-f7e9-434d-b934-37dfb39e92ac\") " pod="openstack/ovn-controller-metrics-47kkq" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.701533 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7cb275de-f7e9-434d-b934-37dfb39e92ac-ovs-rundir\") pod \"ovn-controller-metrics-47kkq\" (UID: \"7cb275de-f7e9-434d-b934-37dfb39e92ac\") " pod="openstack/ovn-controller-metrics-47kkq" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.701877 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7cb275de-f7e9-434d-b934-37dfb39e92ac-ovs-rundir\") pod \"ovn-controller-metrics-47kkq\" (UID: \"7cb275de-f7e9-434d-b934-37dfb39e92ac\") " pod="openstack/ovn-controller-metrics-47kkq" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.701993 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7cb275de-f7e9-434d-b934-37dfb39e92ac-ovn-rundir\") pod \"ovn-controller-metrics-47kkq\" (UID: \"7cb275de-f7e9-434d-b934-37dfb39e92ac\") " pod="openstack/ovn-controller-metrics-47kkq" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.702973 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cb275de-f7e9-434d-b934-37dfb39e92ac-config\") pod \"ovn-controller-metrics-47kkq\" (UID: \"7cb275de-f7e9-434d-b934-37dfb39e92ac\") " pod="openstack/ovn-controller-metrics-47kkq" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.736449 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/7cb275de-f7e9-434d-b934-37dfb39e92ac-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-47kkq\" (UID: \"7cb275de-f7e9-434d-b934-37dfb39e92ac\") " pod="openstack/ovn-controller-metrics-47kkq" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.740982 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb275de-f7e9-434d-b934-37dfb39e92ac-combined-ca-bundle\") pod \"ovn-controller-metrics-47kkq\" (UID: \"7cb275de-f7e9-434d-b934-37dfb39e92ac\") " pod="openstack/ovn-controller-metrics-47kkq" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.742704 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q29fs\" (UniqueName: \"kubernetes.io/projected/7cb275de-f7e9-434d-b934-37dfb39e92ac-kube-api-access-q29fs\") pod \"ovn-controller-metrics-47kkq\" (UID: \"7cb275de-f7e9-434d-b934-37dfb39e92ac\") " pod="openstack/ovn-controller-metrics-47kkq" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.799798 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-hvt9b" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.825145 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-hvt9b"] Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.849621 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-47kkq" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.851939 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8lrfw"] Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.855123 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.859179 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.875041 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8lrfw"] Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.905887 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-8lrfw\" (UID: \"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841\") " pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.905986 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-8lrfw\" (UID: \"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841\") " pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.906031 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-config\") pod \"dnsmasq-dns-86db49b7ff-8lrfw\" (UID: \"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841\") " pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.906200 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-8lrfw\" (UID: \"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841\") " pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw" 
Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.906314 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txcxd\" (UniqueName: \"kubernetes.io/projected/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-kube-api-access-txcxd\") pod \"dnsmasq-dns-86db49b7ff-8lrfw\" (UID: \"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841\") " pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw" Feb 18 06:04:21 crc kubenswrapper[4869]: I0218 06:04:21.917584 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:22 crc kubenswrapper[4869]: I0218 06:04:22.007840 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-8lrfw\" (UID: \"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841\") " pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw" Feb 18 06:04:22 crc kubenswrapper[4869]: I0218 06:04:22.008260 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-config\") pod \"dnsmasq-dns-86db49b7ff-8lrfw\" (UID: \"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841\") " pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw" Feb 18 06:04:22 crc kubenswrapper[4869]: I0218 06:04:22.008368 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-8lrfw\" (UID: \"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841\") " pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw" Feb 18 06:04:22 crc kubenswrapper[4869]: I0218 06:04:22.008405 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txcxd\" (UniqueName: \"kubernetes.io/projected/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-kube-api-access-txcxd\") pod 
\"dnsmasq-dns-86db49b7ff-8lrfw\" (UID: \"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841\") " pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw" Feb 18 06:04:22 crc kubenswrapper[4869]: I0218 06:04:22.009697 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-8lrfw\" (UID: \"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841\") " pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw" Feb 18 06:04:22 crc kubenswrapper[4869]: I0218 06:04:22.009296 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-config\") pod \"dnsmasq-dns-86db49b7ff-8lrfw\" (UID: \"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841\") " pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw" Feb 18 06:04:22 crc kubenswrapper[4869]: I0218 06:04:22.009657 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-8lrfw\" (UID: \"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841\") " pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw" Feb 18 06:04:22 crc kubenswrapper[4869]: I0218 06:04:22.008830 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-8lrfw\" (UID: \"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841\") " pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw" Feb 18 06:04:22 crc kubenswrapper[4869]: I0218 06:04:22.010518 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-8lrfw\" (UID: \"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw" Feb 18 06:04:22 crc kubenswrapper[4869]: I0218 06:04:22.027804 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txcxd\" (UniqueName: \"kubernetes.io/projected/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-kube-api-access-txcxd\") pod \"dnsmasq-dns-86db49b7ff-8lrfw\" (UID: \"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841\") " pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw" Feb 18 06:04:22 crc kubenswrapper[4869]: I0218 06:04:22.088828 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"23dc38a3-9ce0-4f1f-9495-2dc65f2474e5","Type":"ContainerStarted","Data":"090a36567c873f6dbcc1a4d9489ad1b3ba0f617bd3a0ef788fee4b83cf279720"} Feb 18 06:04:22 crc kubenswrapper[4869]: I0218 06:04:22.126336 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=18.942025223999998 podStartE2EDuration="28.126316349s" podCreationTimestamp="2026-02-18 06:03:54 +0000 UTC" firstStartedPulling="2026-02-18 06:04:05.999261343 +0000 UTC m=+943.168349575" lastFinishedPulling="2026-02-18 06:04:15.183552468 +0000 UTC m=+952.352640700" observedRunningTime="2026-02-18 06:04:22.121056241 +0000 UTC m=+959.290144473" watchObservedRunningTime="2026-02-18 06:04:22.126316349 +0000 UTC m=+959.295404581" Feb 18 06:04:22 crc kubenswrapper[4869]: I0218 06:04:22.182444 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw" Feb 18 06:04:22 crc kubenswrapper[4869]: I0218 06:04:22.304883 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-hvt9b"] Feb 18 06:04:22 crc kubenswrapper[4869]: W0218 06:04:22.312987 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc63de375_f72f_4468_9a2a_b1c56c30eabe.slice/crio-2884d5b4a53d0a96a111f978e2e456997edc16c2c29ec65127f1f7daa2dc1d5f WatchSource:0}: Error finding container 2884d5b4a53d0a96a111f978e2e456997edc16c2c29ec65127f1f7daa2dc1d5f: Status 404 returned error can't find the container with id 2884d5b4a53d0a96a111f978e2e456997edc16c2c29ec65127f1f7daa2dc1d5f Feb 18 06:04:22 crc kubenswrapper[4869]: I0218 06:04:22.412922 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-47kkq"] Feb 18 06:04:22 crc kubenswrapper[4869]: W0218 06:04:22.415959 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cb275de_f7e9_434d_b934_37dfb39e92ac.slice/crio-b0585aff26d11055e8424db6944a65ec1f0f761db7a91de40a447f87917887e0 WatchSource:0}: Error finding container b0585aff26d11055e8424db6944a65ec1f0f761db7a91de40a447f87917887e0: Status 404 returned error can't find the container with id b0585aff26d11055e8424db6944a65ec1f0f761db7a91de40a447f87917887e0 Feb 18 06:04:22 crc kubenswrapper[4869]: I0218 06:04:22.645562 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8lrfw"] Feb 18 06:04:22 crc kubenswrapper[4869]: W0218 06:04:22.653113 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cb5aea3_1e80_4cb0_bae7_7d9be7a77841.slice/crio-37c18b4fbb720a70e828f64d8a6ea20cc11ad189db0b583f8a1c111cdb567fb6 WatchSource:0}: Error finding container 
37c18b4fbb720a70e828f64d8a6ea20cc11ad189db0b583f8a1c111cdb567fb6: Status 404 returned error can't find the container with id 37c18b4fbb720a70e828f64d8a6ea20cc11ad189db0b583f8a1c111cdb567fb6 Feb 18 06:04:22 crc kubenswrapper[4869]: I0218 06:04:22.917101 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:22 crc kubenswrapper[4869]: I0218 06:04:22.961921 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.101366 4869 generic.go:334] "Generic (PLEG): container finished" podID="c63de375-f72f-4468-9a2a-b1c56c30eabe" containerID="59b61e5c4fa7cad567c5ad91bba872ecb4d0db819d80ffb8bd23c15e07050f5a" exitCode=0 Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.101444 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-hvt9b" event={"ID":"c63de375-f72f-4468-9a2a-b1c56c30eabe","Type":"ContainerDied","Data":"59b61e5c4fa7cad567c5ad91bba872ecb4d0db819d80ffb8bd23c15e07050f5a"} Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.101485 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-hvt9b" event={"ID":"c63de375-f72f-4468-9a2a-b1c56c30eabe","Type":"ContainerStarted","Data":"2884d5b4a53d0a96a111f978e2e456997edc16c2c29ec65127f1f7daa2dc1d5f"} Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.105000 4869 generic.go:334] "Generic (PLEG): container finished" podID="6cb5aea3-1e80-4cb0-bae7-7d9be7a77841" containerID="f7fd390b28ec4651f960a40c7e41e2c84ed4e8119f7af25eb768835994beee99" exitCode=0 Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.105840 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw" event={"ID":"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841","Type":"ContainerDied","Data":"f7fd390b28ec4651f960a40c7e41e2c84ed4e8119f7af25eb768835994beee99"} Feb 18 06:04:23 
crc kubenswrapper[4869]: I0218 06:04:23.105932 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw" event={"ID":"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841","Type":"ContainerStarted","Data":"37c18b4fbb720a70e828f64d8a6ea20cc11ad189db0b583f8a1c111cdb567fb6"} Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.110897 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-47kkq" event={"ID":"7cb275de-f7e9-434d-b934-37dfb39e92ac","Type":"ContainerStarted","Data":"4b4660cf5e37a8bd90eb674e4187616d4b9daab71c37ae3c42fbb3db5ecd2be5"} Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.110963 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-47kkq" event={"ID":"7cb275de-f7e9-434d-b934-37dfb39e92ac","Type":"ContainerStarted","Data":"b0585aff26d11055e8424db6944a65ec1f0f761db7a91de40a447f87917887e0"} Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.162706 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-47kkq" podStartSLOduration=2.16267789 podStartE2EDuration="2.16267789s" podCreationTimestamp="2026-02-18 06:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:04:23.153698592 +0000 UTC m=+960.322786824" watchObservedRunningTime="2026-02-18 06:04:23.16267789 +0000 UTC m=+960.331766132" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.248356 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.495917 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-hvt9b" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.636608 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 18 06:04:23 crc kubenswrapper[4869]: E0218 06:04:23.637049 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c63de375-f72f-4468-9a2a-b1c56c30eabe" containerName="init" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.637066 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="c63de375-f72f-4468-9a2a-b1c56c30eabe" containerName="init" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.637294 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="c63de375-f72f-4468-9a2a-b1c56c30eabe" containerName="init" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.639417 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.644892 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.644955 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.644978 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-rqf29" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.645087 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c63de375-f72f-4468-9a2a-b1c56c30eabe-config\") pod \"c63de375-f72f-4468-9a2a-b1c56c30eabe\" (UID: \"c63de375-f72f-4468-9a2a-b1c56c30eabe\") " Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.645236 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/c63de375-f72f-4468-9a2a-b1c56c30eabe-dns-svc\") pod \"c63de375-f72f-4468-9a2a-b1c56c30eabe\" (UID: \"c63de375-f72f-4468-9a2a-b1c56c30eabe\") " Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.645334 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c63de375-f72f-4468-9a2a-b1c56c30eabe-ovsdbserver-sb\") pod \"c63de375-f72f-4468-9a2a-b1c56c30eabe\" (UID: \"c63de375-f72f-4468-9a2a-b1c56c30eabe\") " Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.645403 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ghlf\" (UniqueName: \"kubernetes.io/projected/c63de375-f72f-4468-9a2a-b1c56c30eabe-kube-api-access-2ghlf\") pod \"c63de375-f72f-4468-9a2a-b1c56c30eabe\" (UID: \"c63de375-f72f-4468-9a2a-b1c56c30eabe\") " Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.645656 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.669099 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.690425 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c63de375-f72f-4468-9a2a-b1c56c30eabe-kube-api-access-2ghlf" (OuterVolumeSpecName: "kube-api-access-2ghlf") pod "c63de375-f72f-4468-9a2a-b1c56c30eabe" (UID: "c63de375-f72f-4468-9a2a-b1c56c30eabe"). InnerVolumeSpecName "kube-api-access-2ghlf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.692020 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c63de375-f72f-4468-9a2a-b1c56c30eabe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c63de375-f72f-4468-9a2a-b1c56c30eabe" (UID: "c63de375-f72f-4468-9a2a-b1c56c30eabe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.692081 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c63de375-f72f-4468-9a2a-b1c56c30eabe-config" (OuterVolumeSpecName: "config") pod "c63de375-f72f-4468-9a2a-b1c56c30eabe" (UID: "c63de375-f72f-4468-9a2a-b1c56c30eabe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.698472 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c63de375-f72f-4468-9a2a-b1c56c30eabe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c63de375-f72f-4468-9a2a-b1c56c30eabe" (UID: "c63de375-f72f-4468-9a2a-b1c56c30eabe"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.747846 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/65ad1fc9-3393-4e57-9041-c17ef5279ddd-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"65ad1fc9-3393-4e57-9041-c17ef5279ddd\") " pod="openstack/ovn-northd-0" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.747917 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ad1fc9-3393-4e57-9041-c17ef5279ddd-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"65ad1fc9-3393-4e57-9041-c17ef5279ddd\") " pod="openstack/ovn-northd-0" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.747942 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65ad1fc9-3393-4e57-9041-c17ef5279ddd-config\") pod \"ovn-northd-0\" (UID: \"65ad1fc9-3393-4e57-9041-c17ef5279ddd\") " pod="openstack/ovn-northd-0" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.748088 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65ad1fc9-3393-4e57-9041-c17ef5279ddd-scripts\") pod \"ovn-northd-0\" (UID: \"65ad1fc9-3393-4e57-9041-c17ef5279ddd\") " pod="openstack/ovn-northd-0" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.748510 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/65ad1fc9-3393-4e57-9041-c17ef5279ddd-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"65ad1fc9-3393-4e57-9041-c17ef5279ddd\") " pod="openstack/ovn-northd-0" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.748664 
4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/65ad1fc9-3393-4e57-9041-c17ef5279ddd-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"65ad1fc9-3393-4e57-9041-c17ef5279ddd\") " pod="openstack/ovn-northd-0" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.749031 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq7gc\" (UniqueName: \"kubernetes.io/projected/65ad1fc9-3393-4e57-9041-c17ef5279ddd-kube-api-access-vq7gc\") pod \"ovn-northd-0\" (UID: \"65ad1fc9-3393-4e57-9041-c17ef5279ddd\") " pod="openstack/ovn-northd-0" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.749254 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c63de375-f72f-4468-9a2a-b1c56c30eabe-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.749276 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c63de375-f72f-4468-9a2a-b1c56c30eabe-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.749291 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c63de375-f72f-4468-9a2a-b1c56c30eabe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.749309 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ghlf\" (UniqueName: \"kubernetes.io/projected/c63de375-f72f-4468-9a2a-b1c56c30eabe-kube-api-access-2ghlf\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.850644 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/65ad1fc9-3393-4e57-9041-c17ef5279ddd-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"65ad1fc9-3393-4e57-9041-c17ef5279ddd\") " pod="openstack/ovn-northd-0" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.850714 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/65ad1fc9-3393-4e57-9041-c17ef5279ddd-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"65ad1fc9-3393-4e57-9041-c17ef5279ddd\") " pod="openstack/ovn-northd-0" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.850794 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq7gc\" (UniqueName: \"kubernetes.io/projected/65ad1fc9-3393-4e57-9041-c17ef5279ddd-kube-api-access-vq7gc\") pod \"ovn-northd-0\" (UID: \"65ad1fc9-3393-4e57-9041-c17ef5279ddd\") " pod="openstack/ovn-northd-0" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.850815 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/65ad1fc9-3393-4e57-9041-c17ef5279ddd-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"65ad1fc9-3393-4e57-9041-c17ef5279ddd\") " pod="openstack/ovn-northd-0" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.850834 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ad1fc9-3393-4e57-9041-c17ef5279ddd-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"65ad1fc9-3393-4e57-9041-c17ef5279ddd\") " pod="openstack/ovn-northd-0" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.850855 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65ad1fc9-3393-4e57-9041-c17ef5279ddd-config\") pod \"ovn-northd-0\" (UID: \"65ad1fc9-3393-4e57-9041-c17ef5279ddd\") " pod="openstack/ovn-northd-0" Feb 18 06:04:23 crc 
kubenswrapper[4869]: I0218 06:04:23.850889 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65ad1fc9-3393-4e57-9041-c17ef5279ddd-scripts\") pod \"ovn-northd-0\" (UID: \"65ad1fc9-3393-4e57-9041-c17ef5279ddd\") " pod="openstack/ovn-northd-0" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.851722 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65ad1fc9-3393-4e57-9041-c17ef5279ddd-scripts\") pod \"ovn-northd-0\" (UID: \"65ad1fc9-3393-4e57-9041-c17ef5279ddd\") " pod="openstack/ovn-northd-0" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.852140 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/65ad1fc9-3393-4e57-9041-c17ef5279ddd-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"65ad1fc9-3393-4e57-9041-c17ef5279ddd\") " pod="openstack/ovn-northd-0" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.852972 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65ad1fc9-3393-4e57-9041-c17ef5279ddd-config\") pod \"ovn-northd-0\" (UID: \"65ad1fc9-3393-4e57-9041-c17ef5279ddd\") " pod="openstack/ovn-northd-0" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.855270 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/65ad1fc9-3393-4e57-9041-c17ef5279ddd-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"65ad1fc9-3393-4e57-9041-c17ef5279ddd\") " pod="openstack/ovn-northd-0" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.855828 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ad1fc9-3393-4e57-9041-c17ef5279ddd-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: 
\"65ad1fc9-3393-4e57-9041-c17ef5279ddd\") " pod="openstack/ovn-northd-0" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.857112 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/65ad1fc9-3393-4e57-9041-c17ef5279ddd-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"65ad1fc9-3393-4e57-9041-c17ef5279ddd\") " pod="openstack/ovn-northd-0" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.874706 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq7gc\" (UniqueName: \"kubernetes.io/projected/65ad1fc9-3393-4e57-9041-c17ef5279ddd-kube-api-access-vq7gc\") pod \"ovn-northd-0\" (UID: \"65ad1fc9-3393-4e57-9041-c17ef5279ddd\") " pod="openstack/ovn-northd-0" Feb 18 06:04:23 crc kubenswrapper[4869]: I0218 06:04:23.993770 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 18 06:04:24 crc kubenswrapper[4869]: I0218 06:04:24.120587 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-hvt9b" Feb 18 06:04:24 crc kubenswrapper[4869]: I0218 06:04:24.120588 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-hvt9b" event={"ID":"c63de375-f72f-4468-9a2a-b1c56c30eabe","Type":"ContainerDied","Data":"2884d5b4a53d0a96a111f978e2e456997edc16c2c29ec65127f1f7daa2dc1d5f"} Feb 18 06:04:24 crc kubenswrapper[4869]: I0218 06:04:24.120878 4869 scope.go:117] "RemoveContainer" containerID="59b61e5c4fa7cad567c5ad91bba872ecb4d0db819d80ffb8bd23c15e07050f5a" Feb 18 06:04:24 crc kubenswrapper[4869]: I0218 06:04:24.132003 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw" event={"ID":"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841","Type":"ContainerStarted","Data":"4c41796edfd0b6b6aecb5cc67532abbdd93d18dc713953ffb4654d7a0eba0c9d"} Feb 18 06:04:24 crc kubenswrapper[4869]: I0218 06:04:24.132046 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw" Feb 18 06:04:24 crc kubenswrapper[4869]: I0218 06:04:24.174089 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw" podStartSLOduration=3.174064154 podStartE2EDuration="3.174064154s" podCreationTimestamp="2026-02-18 06:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:04:24.15827454 +0000 UTC m=+961.327362802" watchObservedRunningTime="2026-02-18 06:04:24.174064154 +0000 UTC m=+961.343152386" Feb 18 06:04:24 crc kubenswrapper[4869]: I0218 06:04:24.216036 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-hvt9b"] Feb 18 06:04:24 crc kubenswrapper[4869]: I0218 06:04:24.221449 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-hvt9b"] Feb 18 06:04:24 crc kubenswrapper[4869]: I0218 
06:04:24.455949 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 18 06:04:24 crc kubenswrapper[4869]: I0218 06:04:24.780073 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 18 06:04:24 crc kubenswrapper[4869]: I0218 06:04:24.780150 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 18 06:04:24 crc kubenswrapper[4869]: I0218 06:04:24.871407 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 18 06:04:25 crc kubenswrapper[4869]: I0218 06:04:25.140246 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"65ad1fc9-3393-4e57-9041-c17ef5279ddd","Type":"ContainerStarted","Data":"dbf9ca94b81c6c9fe30565353f7591b3c62483e56efea900652c08fbf0e9e711"} Feb 18 06:04:25 crc kubenswrapper[4869]: I0218 06:04:25.208069 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 18 06:04:25 crc kubenswrapper[4869]: I0218 06:04:25.483488 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c63de375-f72f-4468-9a2a-b1c56c30eabe" path="/var/lib/kubelet/pods/c63de375-f72f-4468-9a2a-b1c56c30eabe/volumes" Feb 18 06:04:26 crc kubenswrapper[4869]: I0218 06:04:26.112786 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 18 06:04:26 crc kubenswrapper[4869]: I0218 06:04:26.113074 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 18 06:04:26 crc kubenswrapper[4869]: I0218 06:04:26.152940 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"65ad1fc9-3393-4e57-9041-c17ef5279ddd","Type":"ContainerStarted","Data":"2a694c0175990dcbb61d6579d284b75bbc1796a613a53792f6615a0c25e42115"} Feb 18 
06:04:26 crc kubenswrapper[4869]: I0218 06:04:26.153311 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 18 06:04:26 crc kubenswrapper[4869]: I0218 06:04:26.153330 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"65ad1fc9-3393-4e57-9041-c17ef5279ddd","Type":"ContainerStarted","Data":"9f7da7b6277e5f180b19233c08ba6c108963e5037635f71ecbfc70b9af28342d"} Feb 18 06:04:26 crc kubenswrapper[4869]: I0218 06:04:26.178861 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.166105419 podStartE2EDuration="3.178828165s" podCreationTimestamp="2026-02-18 06:04:23 +0000 UTC" firstStartedPulling="2026-02-18 06:04:24.46814951 +0000 UTC m=+961.637237742" lastFinishedPulling="2026-02-18 06:04:25.480872256 +0000 UTC m=+962.649960488" observedRunningTime="2026-02-18 06:04:26.175494914 +0000 UTC m=+963.344583156" watchObservedRunningTime="2026-02-18 06:04:26.178828165 +0000 UTC m=+963.347916437" Feb 18 06:04:26 crc kubenswrapper[4869]: I0218 06:04:26.206210 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 18 06:04:26 crc kubenswrapper[4869]: I0218 06:04:26.295020 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.444869 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-9a85-account-create-update-tbcd2"] Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.446260 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9a85-account-create-update-tbcd2" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.448489 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.457600 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9a85-account-create-update-tbcd2"] Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.482344 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-sr5s4"] Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.483417 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-sr5s4" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.492828 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-sr5s4"] Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.539467 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1adc4451-3322-4686-aeb5-ea4a457724a4-operator-scripts\") pod \"keystone-db-create-sr5s4\" (UID: \"1adc4451-3322-4686-aeb5-ea4a457724a4\") " pod="openstack/keystone-db-create-sr5s4" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.539536 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5kfj\" (UniqueName: \"kubernetes.io/projected/1adc4451-3322-4686-aeb5-ea4a457724a4-kube-api-access-r5kfj\") pod \"keystone-db-create-sr5s4\" (UID: \"1adc4451-3322-4686-aeb5-ea4a457724a4\") " pod="openstack/keystone-db-create-sr5s4" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.539582 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b69b6\" (UniqueName: 
\"kubernetes.io/projected/d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6-kube-api-access-b69b6\") pod \"keystone-9a85-account-create-update-tbcd2\" (UID: \"d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6\") " pod="openstack/keystone-9a85-account-create-update-tbcd2" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.539688 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6-operator-scripts\") pod \"keystone-9a85-account-create-update-tbcd2\" (UID: \"d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6\") " pod="openstack/keystone-9a85-account-create-update-tbcd2" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.592672 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-vsntf"] Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.594007 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-vsntf" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.603666 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-vsntf"] Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.642824 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1adc4451-3322-4686-aeb5-ea4a457724a4-operator-scripts\") pod \"keystone-db-create-sr5s4\" (UID: \"1adc4451-3322-4686-aeb5-ea4a457724a4\") " pod="openstack/keystone-db-create-sr5s4" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.642881 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5kfj\" (UniqueName: \"kubernetes.io/projected/1adc4451-3322-4686-aeb5-ea4a457724a4-kube-api-access-r5kfj\") pod \"keystone-db-create-sr5s4\" (UID: \"1adc4451-3322-4686-aeb5-ea4a457724a4\") " pod="openstack/keystone-db-create-sr5s4" Feb 18 06:04:27 crc 
kubenswrapper[4869]: I0218 06:04:27.642919 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bs9n\" (UniqueName: \"kubernetes.io/projected/f8bd308c-0bb8-49f6-a629-8d24b1c5d55e-kube-api-access-6bs9n\") pod \"placement-db-create-vsntf\" (UID: \"f8bd308c-0bb8-49f6-a629-8d24b1c5d55e\") " pod="openstack/placement-db-create-vsntf" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.642947 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b69b6\" (UniqueName: \"kubernetes.io/projected/d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6-kube-api-access-b69b6\") pod \"keystone-9a85-account-create-update-tbcd2\" (UID: \"d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6\") " pod="openstack/keystone-9a85-account-create-update-tbcd2" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.642970 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8bd308c-0bb8-49f6-a629-8d24b1c5d55e-operator-scripts\") pod \"placement-db-create-vsntf\" (UID: \"f8bd308c-0bb8-49f6-a629-8d24b1c5d55e\") " pod="openstack/placement-db-create-vsntf" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.643065 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6-operator-scripts\") pod \"keystone-9a85-account-create-update-tbcd2\" (UID: \"d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6\") " pod="openstack/keystone-9a85-account-create-update-tbcd2" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.643925 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6-operator-scripts\") pod \"keystone-9a85-account-create-update-tbcd2\" (UID: 
\"d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6\") " pod="openstack/keystone-9a85-account-create-update-tbcd2" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.643952 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1adc4451-3322-4686-aeb5-ea4a457724a4-operator-scripts\") pod \"keystone-db-create-sr5s4\" (UID: \"1adc4451-3322-4686-aeb5-ea4a457724a4\") " pod="openstack/keystone-db-create-sr5s4" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.648846 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-2d5b-account-create-update-7v5gt"] Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.650121 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2d5b-account-create-update-7v5gt" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.652841 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.668103 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2d5b-account-create-update-7v5gt"] Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.668617 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b69b6\" (UniqueName: \"kubernetes.io/projected/d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6-kube-api-access-b69b6\") pod \"keystone-9a85-account-create-update-tbcd2\" (UID: \"d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6\") " pod="openstack/keystone-9a85-account-create-update-tbcd2" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.684470 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5kfj\" (UniqueName: \"kubernetes.io/projected/1adc4451-3322-4686-aeb5-ea4a457724a4-kube-api-access-r5kfj\") pod \"keystone-db-create-sr5s4\" (UID: \"1adc4451-3322-4686-aeb5-ea4a457724a4\") " 
pod="openstack/keystone-db-create-sr5s4" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.744675 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8bd308c-0bb8-49f6-a629-8d24b1c5d55e-operator-scripts\") pod \"placement-db-create-vsntf\" (UID: \"f8bd308c-0bb8-49f6-a629-8d24b1c5d55e\") " pod="openstack/placement-db-create-vsntf" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.745120 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhxsb\" (UniqueName: \"kubernetes.io/projected/a2ae3a7e-91a4-4a0f-9537-d0438699a82f-kube-api-access-dhxsb\") pod \"placement-2d5b-account-create-update-7v5gt\" (UID: \"a2ae3a7e-91a4-4a0f-9537-d0438699a82f\") " pod="openstack/placement-2d5b-account-create-update-7v5gt" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.745232 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2ae3a7e-91a4-4a0f-9537-d0438699a82f-operator-scripts\") pod \"placement-2d5b-account-create-update-7v5gt\" (UID: \"a2ae3a7e-91a4-4a0f-9537-d0438699a82f\") " pod="openstack/placement-2d5b-account-create-update-7v5gt" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.745339 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bs9n\" (UniqueName: \"kubernetes.io/projected/f8bd308c-0bb8-49f6-a629-8d24b1c5d55e-kube-api-access-6bs9n\") pod \"placement-db-create-vsntf\" (UID: \"f8bd308c-0bb8-49f6-a629-8d24b1c5d55e\") " pod="openstack/placement-db-create-vsntf" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.745727 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8bd308c-0bb8-49f6-a629-8d24b1c5d55e-operator-scripts\") pod 
\"placement-db-create-vsntf\" (UID: \"f8bd308c-0bb8-49f6-a629-8d24b1c5d55e\") " pod="openstack/placement-db-create-vsntf" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.763003 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bs9n\" (UniqueName: \"kubernetes.io/projected/f8bd308c-0bb8-49f6-a629-8d24b1c5d55e-kube-api-access-6bs9n\") pod \"placement-db-create-vsntf\" (UID: \"f8bd308c-0bb8-49f6-a629-8d24b1c5d55e\") " pod="openstack/placement-db-create-vsntf" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.793782 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9a85-account-create-update-tbcd2" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.813122 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-sr5s4" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.846795 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhxsb\" (UniqueName: \"kubernetes.io/projected/a2ae3a7e-91a4-4a0f-9537-d0438699a82f-kube-api-access-dhxsb\") pod \"placement-2d5b-account-create-update-7v5gt\" (UID: \"a2ae3a7e-91a4-4a0f-9537-d0438699a82f\") " pod="openstack/placement-2d5b-account-create-update-7v5gt" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.846858 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2ae3a7e-91a4-4a0f-9537-d0438699a82f-operator-scripts\") pod \"placement-2d5b-account-create-update-7v5gt\" (UID: \"a2ae3a7e-91a4-4a0f-9537-d0438699a82f\") " pod="openstack/placement-2d5b-account-create-update-7v5gt" Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.848292 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2ae3a7e-91a4-4a0f-9537-d0438699a82f-operator-scripts\") pod 
\"placement-2d5b-account-create-update-7v5gt\" (UID: \"a2ae3a7e-91a4-4a0f-9537-d0438699a82f\") " pod="openstack/placement-2d5b-account-create-update-7v5gt"
Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.867422 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhxsb\" (UniqueName: \"kubernetes.io/projected/a2ae3a7e-91a4-4a0f-9537-d0438699a82f-kube-api-access-dhxsb\") pod \"placement-2d5b-account-create-update-7v5gt\" (UID: \"a2ae3a7e-91a4-4a0f-9537-d0438699a82f\") " pod="openstack/placement-2d5b-account-create-update-7v5gt"
Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.908462 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-vsntf"
Feb 18 06:04:27 crc kubenswrapper[4869]: I0218 06:04:27.964411 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2d5b-account-create-update-7v5gt"
Feb 18 06:04:28 crc kubenswrapper[4869]: I0218 06:04:28.262651 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9a85-account-create-update-tbcd2"]
Feb 18 06:04:28 crc kubenswrapper[4869]: I0218 06:04:28.363877 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-sr5s4"]
Feb 18 06:04:28 crc kubenswrapper[4869]: W0218 06:04:28.368279 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1adc4451_3322_4686_aeb5_ea4a457724a4.slice/crio-51c8d4ca5a4138a846aae7251ea5b8c7358a57fa1939c6cd2c5eaf185e0ea638 WatchSource:0}: Error finding container 51c8d4ca5a4138a846aae7251ea5b8c7358a57fa1939c6cd2c5eaf185e0ea638: Status 404 returned error can't find the container with id 51c8d4ca5a4138a846aae7251ea5b8c7358a57fa1939c6cd2c5eaf185e0ea638
Feb 18 06:04:28 crc kubenswrapper[4869]: I0218 06:04:28.446842 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-vsntf"]
Feb 18 06:04:28 crc kubenswrapper[4869]: W0218 06:04:28.449292 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8bd308c_0bb8_49f6_a629_8d24b1c5d55e.slice/crio-727389908c8627905e3c61b8f063f2bb7a8b072611390d0d593e69ccd3293bc1 WatchSource:0}: Error finding container 727389908c8627905e3c61b8f063f2bb7a8b072611390d0d593e69ccd3293bc1: Status 404 returned error can't find the container with id 727389908c8627905e3c61b8f063f2bb7a8b072611390d0d593e69ccd3293bc1
Feb 18 06:04:28 crc kubenswrapper[4869]: I0218 06:04:28.508863 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 18 06:04:28 crc kubenswrapper[4869]: I0218 06:04:28.542971 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2d5b-account-create-update-7v5gt"]
Feb 18 06:04:28 crc kubenswrapper[4869]: I0218 06:04:28.703488 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8lrfw"]
Feb 18 06:04:28 crc kubenswrapper[4869]: I0218 06:04:28.703848 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw" podUID="6cb5aea3-1e80-4cb0-bae7-7d9be7a77841" containerName="dnsmasq-dns" containerID="cri-o://4c41796edfd0b6b6aecb5cc67532abbdd93d18dc713953ffb4654d7a0eba0c9d" gracePeriod=10
Feb 18 06:04:28 crc kubenswrapper[4869]: I0218 06:04:28.716705 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw"
Feb 18 06:04:28 crc kubenswrapper[4869]: I0218 06:04:28.809835 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-rx7zr"]
Feb 18 06:04:28 crc kubenswrapper[4869]: I0218 06:04:28.811246 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-rx7zr"
Feb 18 06:04:28 crc kubenswrapper[4869]: I0218 06:04:28.822877 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-rx7zr"]
Feb 18 06:04:28 crc kubenswrapper[4869]: I0218 06:04:28.875759 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf2194de-4e6e-4d75-814f-480d69124118-dns-svc\") pod \"dnsmasq-dns-698758b865-rx7zr\" (UID: \"bf2194de-4e6e-4d75-814f-480d69124118\") " pod="openstack/dnsmasq-dns-698758b865-rx7zr"
Feb 18 06:04:28 crc kubenswrapper[4869]: I0218 06:04:28.876152 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf2194de-4e6e-4d75-814f-480d69124118-config\") pod \"dnsmasq-dns-698758b865-rx7zr\" (UID: \"bf2194de-4e6e-4d75-814f-480d69124118\") " pod="openstack/dnsmasq-dns-698758b865-rx7zr"
Feb 18 06:04:28 crc kubenswrapper[4869]: I0218 06:04:28.876180 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf2194de-4e6e-4d75-814f-480d69124118-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-rx7zr\" (UID: \"bf2194de-4e6e-4d75-814f-480d69124118\") " pod="openstack/dnsmasq-dns-698758b865-rx7zr"
Feb 18 06:04:28 crc kubenswrapper[4869]: I0218 06:04:28.876229 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q6vz\" (UniqueName: \"kubernetes.io/projected/bf2194de-4e6e-4d75-814f-480d69124118-kube-api-access-9q6vz\") pod \"dnsmasq-dns-698758b865-rx7zr\" (UID: \"bf2194de-4e6e-4d75-814f-480d69124118\") " pod="openstack/dnsmasq-dns-698758b865-rx7zr"
Feb 18 06:04:28 crc kubenswrapper[4869]: I0218 06:04:28.876256 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf2194de-4e6e-4d75-814f-480d69124118-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-rx7zr\" (UID: \"bf2194de-4e6e-4d75-814f-480d69124118\") " pod="openstack/dnsmasq-dns-698758b865-rx7zr"
Feb 18 06:04:28 crc kubenswrapper[4869]: I0218 06:04:28.977972 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q6vz\" (UniqueName: \"kubernetes.io/projected/bf2194de-4e6e-4d75-814f-480d69124118-kube-api-access-9q6vz\") pod \"dnsmasq-dns-698758b865-rx7zr\" (UID: \"bf2194de-4e6e-4d75-814f-480d69124118\") " pod="openstack/dnsmasq-dns-698758b865-rx7zr"
Feb 18 06:04:28 crc kubenswrapper[4869]: I0218 06:04:28.978024 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf2194de-4e6e-4d75-814f-480d69124118-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-rx7zr\" (UID: \"bf2194de-4e6e-4d75-814f-480d69124118\") " pod="openstack/dnsmasq-dns-698758b865-rx7zr"
Feb 18 06:04:28 crc kubenswrapper[4869]: I0218 06:04:28.978063 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf2194de-4e6e-4d75-814f-480d69124118-dns-svc\") pod \"dnsmasq-dns-698758b865-rx7zr\" (UID: \"bf2194de-4e6e-4d75-814f-480d69124118\") " pod="openstack/dnsmasq-dns-698758b865-rx7zr"
Feb 18 06:04:28 crc kubenswrapper[4869]: I0218 06:04:28.978127 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf2194de-4e6e-4d75-814f-480d69124118-config\") pod \"dnsmasq-dns-698758b865-rx7zr\" (UID: \"bf2194de-4e6e-4d75-814f-480d69124118\") " pod="openstack/dnsmasq-dns-698758b865-rx7zr"
Feb 18 06:04:28 crc kubenswrapper[4869]: I0218 06:04:28.978151 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf2194de-4e6e-4d75-814f-480d69124118-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-rx7zr\" (UID: \"bf2194de-4e6e-4d75-814f-480d69124118\") " pod="openstack/dnsmasq-dns-698758b865-rx7zr"
Feb 18 06:04:28 crc kubenswrapper[4869]: I0218 06:04:28.978982 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf2194de-4e6e-4d75-814f-480d69124118-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-rx7zr\" (UID: \"bf2194de-4e6e-4d75-814f-480d69124118\") " pod="openstack/dnsmasq-dns-698758b865-rx7zr"
Feb 18 06:04:28 crc kubenswrapper[4869]: I0218 06:04:28.979049 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf2194de-4e6e-4d75-814f-480d69124118-dns-svc\") pod \"dnsmasq-dns-698758b865-rx7zr\" (UID: \"bf2194de-4e6e-4d75-814f-480d69124118\") " pod="openstack/dnsmasq-dns-698758b865-rx7zr"
Feb 18 06:04:28 crc kubenswrapper[4869]: I0218 06:04:28.979172 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf2194de-4e6e-4d75-814f-480d69124118-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-rx7zr\" (UID: \"bf2194de-4e6e-4d75-814f-480d69124118\") " pod="openstack/dnsmasq-dns-698758b865-rx7zr"
Feb 18 06:04:28 crc kubenswrapper[4869]: I0218 06:04:28.979417 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf2194de-4e6e-4d75-814f-480d69124118-config\") pod \"dnsmasq-dns-698758b865-rx7zr\" (UID: \"bf2194de-4e6e-4d75-814f-480d69124118\") " pod="openstack/dnsmasq-dns-698758b865-rx7zr"
Feb 18 06:04:29 crc kubenswrapper[4869]: I0218 06:04:29.000331 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q6vz\" (UniqueName: \"kubernetes.io/projected/bf2194de-4e6e-4d75-814f-480d69124118-kube-api-access-9q6vz\") pod \"dnsmasq-dns-698758b865-rx7zr\" (UID: \"bf2194de-4e6e-4d75-814f-480d69124118\") " pod="openstack/dnsmasq-dns-698758b865-rx7zr"
Feb 18 06:04:29 crc kubenswrapper[4869]: I0218 06:04:29.175785 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-rx7zr"
Feb 18 06:04:29 crc kubenswrapper[4869]: I0218 06:04:29.183416 4869 generic.go:334] "Generic (PLEG): container finished" podID="6cb5aea3-1e80-4cb0-bae7-7d9be7a77841" containerID="4c41796edfd0b6b6aecb5cc67532abbdd93d18dc713953ffb4654d7a0eba0c9d" exitCode=0
Feb 18 06:04:29 crc kubenswrapper[4869]: I0218 06:04:29.183477 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw" event={"ID":"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841","Type":"ContainerDied","Data":"4c41796edfd0b6b6aecb5cc67532abbdd93d18dc713953ffb4654d7a0eba0c9d"}
Feb 18 06:04:29 crc kubenswrapper[4869]: I0218 06:04:29.185524 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2d5b-account-create-update-7v5gt" event={"ID":"a2ae3a7e-91a4-4a0f-9537-d0438699a82f","Type":"ContainerStarted","Data":"3a42b6c917e0e474cae8fda604821a88ef3e65e17db6b67cbf3df4e0b351920c"}
Feb 18 06:04:29 crc kubenswrapper[4869]: I0218 06:04:29.190596 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9a85-account-create-update-tbcd2" event={"ID":"d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6","Type":"ContainerStarted","Data":"832ea9537194d4e26a69602738d9f3e26e1ae7cb22b0a95d539b36c5578d1a46"}
Feb 18 06:04:29 crc kubenswrapper[4869]: I0218 06:04:29.198229 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-sr5s4" event={"ID":"1adc4451-3322-4686-aeb5-ea4a457724a4","Type":"ContainerStarted","Data":"51c8d4ca5a4138a846aae7251ea5b8c7358a57fa1939c6cd2c5eaf185e0ea638"}
Feb 18 06:04:29 crc kubenswrapper[4869]: I0218 06:04:29.202819 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vsntf" event={"ID":"f8bd308c-0bb8-49f6-a629-8d24b1c5d55e","Type":"ContainerStarted","Data":"727389908c8627905e3c61b8f063f2bb7a8b072611390d0d593e69ccd3293bc1"}
Feb 18 06:04:29 crc kubenswrapper[4869]: I0218 06:04:29.705579 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-rx7zr"]
Feb 18 06:04:29 crc kubenswrapper[4869]: W0218 06:04:29.714180 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf2194de_4e6e_4d75_814f_480d69124118.slice/crio-3cbc467da7336a49b83d45874d562aa4b46be9e90e410b6ca05405081830f9d6 WatchSource:0}: Error finding container 3cbc467da7336a49b83d45874d562aa4b46be9e90e410b6ca05405081830f9d6: Status 404 returned error can't find the container with id 3cbc467da7336a49b83d45874d562aa4b46be9e90e410b6ca05405081830f9d6
Feb 18 06:04:29 crc kubenswrapper[4869]: I0218 06:04:29.923758 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Feb 18 06:04:29 crc kubenswrapper[4869]: I0218 06:04:29.931719 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Feb 18 06:04:29 crc kubenswrapper[4869]: I0218 06:04:29.936147 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Feb 18 06:04:29 crc kubenswrapper[4869]: I0218 06:04:29.936291 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Feb 18 06:04:29 crc kubenswrapper[4869]: I0218 06:04:29.936452 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Feb 18 06:04:29 crc kubenswrapper[4869]: I0218 06:04:29.938194 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-njsj6"
Feb 18 06:04:29 crc kubenswrapper[4869]: I0218 06:04:29.955126 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Feb 18 06:04:30 crc kubenswrapper[4869]: I0218 06:04:30.002267 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s85l\" (UniqueName: \"kubernetes.io/projected/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-kube-api-access-7s85l\") pod \"swift-storage-0\" (UID: \"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624\") " pod="openstack/swift-storage-0"
Feb 18 06:04:30 crc kubenswrapper[4869]: I0218 06:04:30.002329 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624\") " pod="openstack/swift-storage-0"
Feb 18 06:04:30 crc kubenswrapper[4869]: I0218 06:04:30.002462 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-etc-swift\") pod \"swift-storage-0\" (UID: \"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624\") " pod="openstack/swift-storage-0"
Feb 18 06:04:30 crc kubenswrapper[4869]: I0218 06:04:30.002496 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-lock\") pod \"swift-storage-0\" (UID: \"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624\") " pod="openstack/swift-storage-0"
Feb 18 06:04:30 crc kubenswrapper[4869]: I0218 06:04:30.002556 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624\") " pod="openstack/swift-storage-0"
Feb 18 06:04:30 crc kubenswrapper[4869]: I0218 06:04:30.002580 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-cache\") pod \"swift-storage-0\" (UID: \"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624\") " pod="openstack/swift-storage-0"
Feb 18 06:04:30 crc kubenswrapper[4869]: I0218 06:04:30.104127 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-lock\") pod \"swift-storage-0\" (UID: \"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624\") " pod="openstack/swift-storage-0"
Feb 18 06:04:30 crc kubenswrapper[4869]: I0218 06:04:30.104219 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624\") " pod="openstack/swift-storage-0"
Feb 18 06:04:30 crc kubenswrapper[4869]: I0218 06:04:30.104244 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-cache\") pod \"swift-storage-0\" (UID: \"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624\") " pod="openstack/swift-storage-0"
Feb 18 06:04:30 crc kubenswrapper[4869]: I0218 06:04:30.104312 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s85l\" (UniqueName: \"kubernetes.io/projected/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-kube-api-access-7s85l\") pod \"swift-storage-0\" (UID: \"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624\") " pod="openstack/swift-storage-0"
Feb 18 06:04:30 crc kubenswrapper[4869]: I0218 06:04:30.104338 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624\") " pod="openstack/swift-storage-0"
Feb 18 06:04:30 crc kubenswrapper[4869]: I0218 06:04:30.104569 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-etc-swift\") pod \"swift-storage-0\" (UID: \"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624\") " pod="openstack/swift-storage-0"
Feb 18 06:04:30 crc kubenswrapper[4869]: E0218 06:04:30.104789 4869 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 18 06:04:30 crc kubenswrapper[4869]: E0218 06:04:30.104812 4869 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 18 06:04:30 crc kubenswrapper[4869]: E0218 06:04:30.104875 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-etc-swift podName:253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624 nodeName:}" failed. No retries permitted until 2026-02-18 06:04:30.604851668 +0000 UTC m=+967.773939900 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-etc-swift") pod "swift-storage-0" (UID: "253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624") : configmap "swift-ring-files" not found
Feb 18 06:04:30 crc kubenswrapper[4869]: I0218 06:04:30.104903 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/swift-storage-0"
Feb 18 06:04:30 crc kubenswrapper[4869]: I0218 06:04:30.106580 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-cache\") pod \"swift-storage-0\" (UID: \"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624\") " pod="openstack/swift-storage-0"
Feb 18 06:04:30 crc kubenswrapper[4869]: I0218 06:04:30.107140 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-lock\") pod \"swift-storage-0\" (UID: \"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624\") " pod="openstack/swift-storage-0"
Feb 18 06:04:30 crc kubenswrapper[4869]: I0218 06:04:30.117025 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624\") " pod="openstack/swift-storage-0"
Feb 18 06:04:30 crc kubenswrapper[4869]: I0218 06:04:30.125708 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s85l\" (UniqueName: \"kubernetes.io/projected/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-kube-api-access-7s85l\") pod \"swift-storage-0\" (UID: \"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624\") " pod="openstack/swift-storage-0"
Feb 18 06:04:30 crc kubenswrapper[4869]: I0218 06:04:30.142217 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624\") " pod="openstack/swift-storage-0"
Feb 18 06:04:30 crc kubenswrapper[4869]: I0218 06:04:30.215908 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-rx7zr" event={"ID":"bf2194de-4e6e-4d75-814f-480d69124118","Type":"ContainerStarted","Data":"3cbc467da7336a49b83d45874d562aa4b46be9e90e410b6ca05405081830f9d6"}
Feb 18 06:04:30 crc kubenswrapper[4869]: I0218 06:04:30.612954 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-etc-swift\") pod \"swift-storage-0\" (UID: \"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624\") " pod="openstack/swift-storage-0"
Feb 18 06:04:30 crc kubenswrapper[4869]: E0218 06:04:30.613802 4869 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 18 06:04:30 crc kubenswrapper[4869]: E0218 06:04:30.613890 4869 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 18 06:04:30 crc kubenswrapper[4869]: E0218 06:04:30.614003 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-etc-swift podName:253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624 nodeName:}" failed. No retries permitted until 2026-02-18 06:04:31.613981968 +0000 UTC m=+968.783070200 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-etc-swift") pod "swift-storage-0" (UID: "253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624") : configmap "swift-ring-files" not found
Feb 18 06:04:31 crc kubenswrapper[4869]: I0218 06:04:31.630275 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-etc-swift\") pod \"swift-storage-0\" (UID: \"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624\") " pod="openstack/swift-storage-0"
Feb 18 06:04:31 crc kubenswrapper[4869]: E0218 06:04:31.630830 4869 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 18 06:04:31 crc kubenswrapper[4869]: E0218 06:04:31.630850 4869 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 18 06:04:31 crc kubenswrapper[4869]: E0218 06:04:31.630913 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-etc-swift podName:253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624 nodeName:}" failed. No retries permitted until 2026-02-18 06:04:33.630889835 +0000 UTC m=+970.799978077 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-etc-swift") pod "swift-storage-0" (UID: "253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624") : configmap "swift-ring-files" not found
Feb 18 06:04:31 crc kubenswrapper[4869]: I0218 06:04:31.634085 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-jhdcd"]
Feb 18 06:04:31 crc kubenswrapper[4869]: I0218 06:04:31.635328 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jhdcd"
Feb 18 06:04:31 crc kubenswrapper[4869]: I0218 06:04:31.667938 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jhdcd"]
Feb 18 06:04:31 crc kubenswrapper[4869]: I0218 06:04:31.731968 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2268e717-5605-4efd-aed0-0323235e9211-operator-scripts\") pod \"glance-db-create-jhdcd\" (UID: \"2268e717-5605-4efd-aed0-0323235e9211\") " pod="openstack/glance-db-create-jhdcd"
Feb 18 06:04:31 crc kubenswrapper[4869]: I0218 06:04:31.732071 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkmr6\" (UniqueName: \"kubernetes.io/projected/2268e717-5605-4efd-aed0-0323235e9211-kube-api-access-pkmr6\") pod \"glance-db-create-jhdcd\" (UID: \"2268e717-5605-4efd-aed0-0323235e9211\") " pod="openstack/glance-db-create-jhdcd"
Feb 18 06:04:31 crc kubenswrapper[4869]: I0218 06:04:31.739337 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-143a-account-create-update-n6nb8"]
Feb 18 06:04:31 crc kubenswrapper[4869]: I0218 06:04:31.741591 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-143a-account-create-update-n6nb8"
Feb 18 06:04:31 crc kubenswrapper[4869]: I0218 06:04:31.744131 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Feb 18 06:04:31 crc kubenswrapper[4869]: I0218 06:04:31.750124 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-143a-account-create-update-n6nb8"]
Feb 18 06:04:31 crc kubenswrapper[4869]: I0218 06:04:31.833794 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aaf77221-4fa8-4a41-b019-042eb77a2553-operator-scripts\") pod \"glance-143a-account-create-update-n6nb8\" (UID: \"aaf77221-4fa8-4a41-b019-042eb77a2553\") " pod="openstack/glance-143a-account-create-update-n6nb8"
Feb 18 06:04:31 crc kubenswrapper[4869]: I0218 06:04:31.833863 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwgm9\" (UniqueName: \"kubernetes.io/projected/aaf77221-4fa8-4a41-b019-042eb77a2553-kube-api-access-cwgm9\") pod \"glance-143a-account-create-update-n6nb8\" (UID: \"aaf77221-4fa8-4a41-b019-042eb77a2553\") " pod="openstack/glance-143a-account-create-update-n6nb8"
Feb 18 06:04:31 crc kubenswrapper[4869]: I0218 06:04:31.833904 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2268e717-5605-4efd-aed0-0323235e9211-operator-scripts\") pod \"glance-db-create-jhdcd\" (UID: \"2268e717-5605-4efd-aed0-0323235e9211\") " pod="openstack/glance-db-create-jhdcd"
Feb 18 06:04:31 crc kubenswrapper[4869]: I0218 06:04:31.833968 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkmr6\" (UniqueName: \"kubernetes.io/projected/2268e717-5605-4efd-aed0-0323235e9211-kube-api-access-pkmr6\") pod \"glance-db-create-jhdcd\" (UID: \"2268e717-5605-4efd-aed0-0323235e9211\") " pod="openstack/glance-db-create-jhdcd"
Feb 18 06:04:31 crc kubenswrapper[4869]: I0218 06:04:31.835442 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2268e717-5605-4efd-aed0-0323235e9211-operator-scripts\") pod \"glance-db-create-jhdcd\" (UID: \"2268e717-5605-4efd-aed0-0323235e9211\") " pod="openstack/glance-db-create-jhdcd"
Feb 18 06:04:31 crc kubenswrapper[4869]: I0218 06:04:31.854081 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkmr6\" (UniqueName: \"kubernetes.io/projected/2268e717-5605-4efd-aed0-0323235e9211-kube-api-access-pkmr6\") pod \"glance-db-create-jhdcd\" (UID: \"2268e717-5605-4efd-aed0-0323235e9211\") " pod="openstack/glance-db-create-jhdcd"
Feb 18 06:04:31 crc kubenswrapper[4869]: I0218 06:04:31.935303 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aaf77221-4fa8-4a41-b019-042eb77a2553-operator-scripts\") pod \"glance-143a-account-create-update-n6nb8\" (UID: \"aaf77221-4fa8-4a41-b019-042eb77a2553\") " pod="openstack/glance-143a-account-create-update-n6nb8"
Feb 18 06:04:31 crc kubenswrapper[4869]: I0218 06:04:31.935377 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwgm9\" (UniqueName: \"kubernetes.io/projected/aaf77221-4fa8-4a41-b019-042eb77a2553-kube-api-access-cwgm9\") pod \"glance-143a-account-create-update-n6nb8\" (UID: \"aaf77221-4fa8-4a41-b019-042eb77a2553\") " pod="openstack/glance-143a-account-create-update-n6nb8"
Feb 18 06:04:31 crc kubenswrapper[4869]: I0218 06:04:31.936423 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aaf77221-4fa8-4a41-b019-042eb77a2553-operator-scripts\") pod \"glance-143a-account-create-update-n6nb8\" (UID: \"aaf77221-4fa8-4a41-b019-042eb77a2553\") " pod="openstack/glance-143a-account-create-update-n6nb8"
Feb 18 06:04:31 crc kubenswrapper[4869]: I0218 06:04:31.953364 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jhdcd"
Feb 18 06:04:31 crc kubenswrapper[4869]: I0218 06:04:31.972563 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwgm9\" (UniqueName: \"kubernetes.io/projected/aaf77221-4fa8-4a41-b019-042eb77a2553-kube-api-access-cwgm9\") pod \"glance-143a-account-create-update-n6nb8\" (UID: \"aaf77221-4fa8-4a41-b019-042eb77a2553\") " pod="openstack/glance-143a-account-create-update-n6nb8"
Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.060356 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-143a-account-create-update-n6nb8"
Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.260277 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2d5b-account-create-update-7v5gt" event={"ID":"a2ae3a7e-91a4-4a0f-9537-d0438699a82f","Type":"ContainerStarted","Data":"bc01ec2d48ab1e6ee0cbace913062ec9603c1593a443ca5b58e55e727c3f6b06"}
Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.273788 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9a85-account-create-update-tbcd2" event={"ID":"d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6","Type":"ContainerStarted","Data":"664a9fe9e5f25c24bca12b867ce99a07d03c1cac1b1a472034074022186c981d"}
Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.277338 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-2d5b-account-create-update-7v5gt" podStartSLOduration=5.277326933 podStartE2EDuration="5.277326933s" podCreationTimestamp="2026-02-18 06:04:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:04:32.273772357 +0000 UTC m=+969.442860589" watchObservedRunningTime="2026-02-18 06:04:32.277326933 +0000 UTC m=+969.446415175"
Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.295210 4869 generic.go:334] "Generic (PLEG): container finished" podID="bf2194de-4e6e-4d75-814f-480d69124118" containerID="c6dec9587fa8660ba7649c549d4f1d4bafc6c3c14e4fe0122fac7538fb2dedc0" exitCode=0
Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.295307 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-rx7zr" event={"ID":"bf2194de-4e6e-4d75-814f-480d69124118","Type":"ContainerDied","Data":"c6dec9587fa8660ba7649c549d4f1d4bafc6c3c14e4fe0122fac7538fb2dedc0"}
Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.301769 4869 generic.go:334] "Generic (PLEG): container finished" podID="1adc4451-3322-4686-aeb5-ea4a457724a4" containerID="12a5b1032db1423dd4da6ba897ccbb8d42330c8210880d1e03c015befbaeb840" exitCode=0
Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.301837 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-sr5s4" event={"ID":"1adc4451-3322-4686-aeb5-ea4a457724a4","Type":"ContainerDied","Data":"12a5b1032db1423dd4da6ba897ccbb8d42330c8210880d1e03c015befbaeb840"}
Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.335801 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-9a85-account-create-update-tbcd2" podStartSLOduration=5.335779483 podStartE2EDuration="5.335779483s" podCreationTimestamp="2026-02-18 06:04:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:04:32.296425147 +0000 UTC m=+969.465513379" watchObservedRunningTime="2026-02-18 06:04:32.335779483 +0000 UTC m=+969.504867715"
Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.343014 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vsntf" event={"ID":"f8bd308c-0bb8-49f6-a629-8d24b1c5d55e","Type":"ContainerStarted","Data":"fb67abc41fd2a84058fb464f4447e3c9c87d18f4f134427df5cd99f94ec5fdaf"}
Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.378165 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw" event={"ID":"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841","Type":"ContainerDied","Data":"37c18b4fbb720a70e828f64d8a6ea20cc11ad189db0b583f8a1c111cdb567fb6"}
Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.378222 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37c18b4fbb720a70e828f64d8a6ea20cc11ad189db0b583f8a1c111cdb567fb6"
Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.415927 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw"
Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.481388 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jhdcd"]
Feb 18 06:04:32 crc kubenswrapper[4869]: W0218 06:04:32.483353 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2268e717_5605_4efd_aed0_0323235e9211.slice/crio-b80ed7fb650c5f05bdd521ae8a7c91f2dd38fccc6c921c5da5c64b0fce09fa33 WatchSource:0}: Error finding container b80ed7fb650c5f05bdd521ae8a7c91f2dd38fccc6c921c5da5c64b0fce09fa33: Status 404 returned error can't find the container with id b80ed7fb650c5f05bdd521ae8a7c91f2dd38fccc6c921c5da5c64b0fce09fa33
Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.557509 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-dns-svc\") pod \"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841\" (UID: \"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841\") "
Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.557596 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txcxd\" (UniqueName: \"kubernetes.io/projected/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-kube-api-access-txcxd\") pod \"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841\" (UID: \"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841\") "
Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.557686 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-ovsdbserver-nb\") pod \"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841\" (UID: \"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841\") "
Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.557704 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-ovsdbserver-sb\") pod \"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841\" (UID: \"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841\") "
Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.557728 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-config\") pod \"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841\" (UID: \"6cb5aea3-1e80-4cb0-bae7-7d9be7a77841\") "
Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.579272 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-kube-api-access-txcxd" (OuterVolumeSpecName: "kube-api-access-txcxd") pod "6cb5aea3-1e80-4cb0-bae7-7d9be7a77841" (UID: "6cb5aea3-1e80-4cb0-bae7-7d9be7a77841"). InnerVolumeSpecName "kube-api-access-txcxd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.646930 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6cb5aea3-1e80-4cb0-bae7-7d9be7a77841" (UID: "6cb5aea3-1e80-4cb0-bae7-7d9be7a77841"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.650500 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-config" (OuterVolumeSpecName: "config") pod "6cb5aea3-1e80-4cb0-bae7-7d9be7a77841" (UID: "6cb5aea3-1e80-4cb0-bae7-7d9be7a77841"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.653842 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6cb5aea3-1e80-4cb0-bae7-7d9be7a77841" (UID: "6cb5aea3-1e80-4cb0-bae7-7d9be7a77841"). InnerVolumeSpecName "ovsdbserver-sb".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.663834 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.663866 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.663875 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.663885 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txcxd\" (UniqueName: \"kubernetes.io/projected/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-kube-api-access-txcxd\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.673343 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-143a-account-create-update-n6nb8"] Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.679548 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6cb5aea3-1e80-4cb0-bae7-7d9be7a77841" (UID: "6cb5aea3-1e80-4cb0-bae7-7d9be7a77841"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:04:32 crc kubenswrapper[4869]: I0218 06:04:32.765440 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.389003 4869 generic.go:334] "Generic (PLEG): container finished" podID="a2ae3a7e-91a4-4a0f-9537-d0438699a82f" containerID="bc01ec2d48ab1e6ee0cbace913062ec9603c1593a443ca5b58e55e727c3f6b06" exitCode=0 Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.389051 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2d5b-account-create-update-7v5gt" event={"ID":"a2ae3a7e-91a4-4a0f-9537-d0438699a82f","Type":"ContainerDied","Data":"bc01ec2d48ab1e6ee0cbace913062ec9603c1593a443ca5b58e55e727c3f6b06"} Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.390887 4869 generic.go:334] "Generic (PLEG): container finished" podID="d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6" containerID="664a9fe9e5f25c24bca12b867ce99a07d03c1cac1b1a472034074022186c981d" exitCode=0 Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.390964 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9a85-account-create-update-tbcd2" event={"ID":"d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6","Type":"ContainerDied","Data":"664a9fe9e5f25c24bca12b867ce99a07d03c1cac1b1a472034074022186c981d"} Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.393319 4869 generic.go:334] "Generic (PLEG): container finished" podID="aaf77221-4fa8-4a41-b019-042eb77a2553" containerID="2d62c10291ec70b24ec3577c01b2d046a3eaacd18541f9dc481a46ab7bd80493" exitCode=0 Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.393394 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-143a-account-create-update-n6nb8" 
event={"ID":"aaf77221-4fa8-4a41-b019-042eb77a2553","Type":"ContainerDied","Data":"2d62c10291ec70b24ec3577c01b2d046a3eaacd18541f9dc481a46ab7bd80493"} Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.393420 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-143a-account-create-update-n6nb8" event={"ID":"aaf77221-4fa8-4a41-b019-042eb77a2553","Type":"ContainerStarted","Data":"bdec1df4e343faf32dc0d108e22819e5d4c58f6daee695fa5d08a4afd61a0346"} Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.395513 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-rx7zr" event={"ID":"bf2194de-4e6e-4d75-814f-480d69124118","Type":"ContainerStarted","Data":"fa6bcbdc86a045336486a638bdc9d78200e2ed1d9973d63f4978d97356fa688c"} Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.396462 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-rx7zr" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.398447 4869 generic.go:334] "Generic (PLEG): container finished" podID="f8bd308c-0bb8-49f6-a629-8d24b1c5d55e" containerID="fb67abc41fd2a84058fb464f4447e3c9c87d18f4f134427df5cd99f94ec5fdaf" exitCode=0 Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.398536 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vsntf" event={"ID":"f8bd308c-0bb8-49f6-a629-8d24b1c5d55e","Type":"ContainerDied","Data":"fb67abc41fd2a84058fb464f4447e3c9c87d18f4f134427df5cd99f94ec5fdaf"} Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.400586 4869 generic.go:334] "Generic (PLEG): container finished" podID="2268e717-5605-4efd-aed0-0323235e9211" containerID="4539cebbd7f59990896a3013e3ad7a8002f56e3a5cfb0b1bcb495b2dc27b1f58" exitCode=0 Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.400682 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.403936 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jhdcd" event={"ID":"2268e717-5605-4efd-aed0-0323235e9211","Type":"ContainerDied","Data":"4539cebbd7f59990896a3013e3ad7a8002f56e3a5cfb0b1bcb495b2dc27b1f58"} Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.403991 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jhdcd" event={"ID":"2268e717-5605-4efd-aed0-0323235e9211","Type":"ContainerStarted","Data":"b80ed7fb650c5f05bdd521ae8a7c91f2dd38fccc6c921c5da5c64b0fce09fa33"} Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.443767 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-rx7zr" podStartSLOduration=5.443716393 podStartE2EDuration="5.443716393s" podCreationTimestamp="2026-02-18 06:04:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:04:33.440329521 +0000 UTC m=+970.609417753" watchObservedRunningTime="2026-02-18 06:04:33.443716393 +0000 UTC m=+970.612804625" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.445155 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-s9bpj"] Feb 18 06:04:33 crc kubenswrapper[4869]: E0218 06:04:33.445528 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cb5aea3-1e80-4cb0-bae7-7d9be7a77841" containerName="init" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.445545 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cb5aea3-1e80-4cb0-bae7-7d9be7a77841" containerName="init" Feb 18 06:04:33 crc kubenswrapper[4869]: E0218 06:04:33.445571 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cb5aea3-1e80-4cb0-bae7-7d9be7a77841" containerName="dnsmasq-dns" Feb 
18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.445578 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cb5aea3-1e80-4cb0-bae7-7d9be7a77841" containerName="dnsmasq-dns" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.445719 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cb5aea3-1e80-4cb0-bae7-7d9be7a77841" containerName="dnsmasq-dns" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.446357 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-s9bpj" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.448540 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.466972 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-s9bpj"] Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.573025 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8lrfw"] Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.578280 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9f44589-815e-4617-ad29-beba672ea7b3-operator-scripts\") pod \"root-account-create-update-s9bpj\" (UID: \"c9f44589-815e-4617-ad29-beba672ea7b3\") " pod="openstack/root-account-create-update-s9bpj" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.578394 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwjfh\" (UniqueName: \"kubernetes.io/projected/c9f44589-815e-4617-ad29-beba672ea7b3-kube-api-access-vwjfh\") pod \"root-account-create-update-s9bpj\" (UID: \"c9f44589-815e-4617-ad29-beba672ea7b3\") " pod="openstack/root-account-create-update-s9bpj" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.583092 4869 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8lrfw"] Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.679846 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9f44589-815e-4617-ad29-beba672ea7b3-operator-scripts\") pod \"root-account-create-update-s9bpj\" (UID: \"c9f44589-815e-4617-ad29-beba672ea7b3\") " pod="openstack/root-account-create-update-s9bpj" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.679996 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwjfh\" (UniqueName: \"kubernetes.io/projected/c9f44589-815e-4617-ad29-beba672ea7b3-kube-api-access-vwjfh\") pod \"root-account-create-update-s9bpj\" (UID: \"c9f44589-815e-4617-ad29-beba672ea7b3\") " pod="openstack/root-account-create-update-s9bpj" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.680039 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-etc-swift\") pod \"swift-storage-0\" (UID: \"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624\") " pod="openstack/swift-storage-0" Feb 18 06:04:33 crc kubenswrapper[4869]: E0218 06:04:33.680231 4869 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 18 06:04:33 crc kubenswrapper[4869]: E0218 06:04:33.680247 4869 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 18 06:04:33 crc kubenswrapper[4869]: E0218 06:04:33.680302 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-etc-swift podName:253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624 nodeName:}" failed. 
No retries permitted until 2026-02-18 06:04:37.680282481 +0000 UTC m=+974.849370733 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-etc-swift") pod "swift-storage-0" (UID: "253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624") : configmap "swift-ring-files" not found Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.681465 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9f44589-815e-4617-ad29-beba672ea7b3-operator-scripts\") pod \"root-account-create-update-s9bpj\" (UID: \"c9f44589-815e-4617-ad29-beba672ea7b3\") " pod="openstack/root-account-create-update-s9bpj" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.701846 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwjfh\" (UniqueName: \"kubernetes.io/projected/c9f44589-815e-4617-ad29-beba672ea7b3-kube-api-access-vwjfh\") pod \"root-account-create-update-s9bpj\" (UID: \"c9f44589-815e-4617-ad29-beba672ea7b3\") " pod="openstack/root-account-create-update-s9bpj" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.774291 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-s9bpj" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.784814 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-vc4l8"] Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.785879 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vc4l8" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.797984 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.798228 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.798395 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.801989 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vc4l8"] Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.882649 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0cb2f895-3d57-468e-8197-636fcc33afe4-ring-data-devices\") pod \"swift-ring-rebalance-vc4l8\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " pod="openstack/swift-ring-rebalance-vc4l8" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.882927 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shkww\" (UniqueName: \"kubernetes.io/projected/0cb2f895-3d57-468e-8197-636fcc33afe4-kube-api-access-shkww\") pod \"swift-ring-rebalance-vc4l8\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " pod="openstack/swift-ring-rebalance-vc4l8" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.882949 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0cb2f895-3d57-468e-8197-636fcc33afe4-dispersionconf\") pod \"swift-ring-rebalance-vc4l8\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " pod="openstack/swift-ring-rebalance-vc4l8" Feb 18 06:04:33 crc 
kubenswrapper[4869]: I0218 06:04:33.882982 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0cb2f895-3d57-468e-8197-636fcc33afe4-swiftconf\") pod \"swift-ring-rebalance-vc4l8\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " pod="openstack/swift-ring-rebalance-vc4l8" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.883016 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cb2f895-3d57-468e-8197-636fcc33afe4-combined-ca-bundle\") pod \"swift-ring-rebalance-vc4l8\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " pod="openstack/swift-ring-rebalance-vc4l8" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.883037 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0cb2f895-3d57-468e-8197-636fcc33afe4-scripts\") pod \"swift-ring-rebalance-vc4l8\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " pod="openstack/swift-ring-rebalance-vc4l8" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.883059 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0cb2f895-3d57-468e-8197-636fcc33afe4-etc-swift\") pod \"swift-ring-rebalance-vc4l8\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " pod="openstack/swift-ring-rebalance-vc4l8" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.908555 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-vsntf" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.928609 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-sr5s4" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.984490 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5kfj\" (UniqueName: \"kubernetes.io/projected/1adc4451-3322-4686-aeb5-ea4a457724a4-kube-api-access-r5kfj\") pod \"1adc4451-3322-4686-aeb5-ea4a457724a4\" (UID: \"1adc4451-3322-4686-aeb5-ea4a457724a4\") " Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.984555 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bs9n\" (UniqueName: \"kubernetes.io/projected/f8bd308c-0bb8-49f6-a629-8d24b1c5d55e-kube-api-access-6bs9n\") pod \"f8bd308c-0bb8-49f6-a629-8d24b1c5d55e\" (UID: \"f8bd308c-0bb8-49f6-a629-8d24b1c5d55e\") " Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.984580 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1adc4451-3322-4686-aeb5-ea4a457724a4-operator-scripts\") pod \"1adc4451-3322-4686-aeb5-ea4a457724a4\" (UID: \"1adc4451-3322-4686-aeb5-ea4a457724a4\") " Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.984668 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8bd308c-0bb8-49f6-a629-8d24b1c5d55e-operator-scripts\") pod \"f8bd308c-0bb8-49f6-a629-8d24b1c5d55e\" (UID: \"f8bd308c-0bb8-49f6-a629-8d24b1c5d55e\") " Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.984935 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0cb2f895-3d57-468e-8197-636fcc33afe4-ring-data-devices\") pod \"swift-ring-rebalance-vc4l8\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " pod="openstack/swift-ring-rebalance-vc4l8" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.984964 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shkww\" (UniqueName: \"kubernetes.io/projected/0cb2f895-3d57-468e-8197-636fcc33afe4-kube-api-access-shkww\") pod \"swift-ring-rebalance-vc4l8\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " pod="openstack/swift-ring-rebalance-vc4l8" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.984984 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0cb2f895-3d57-468e-8197-636fcc33afe4-dispersionconf\") pod \"swift-ring-rebalance-vc4l8\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " pod="openstack/swift-ring-rebalance-vc4l8" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.985019 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0cb2f895-3d57-468e-8197-636fcc33afe4-swiftconf\") pod \"swift-ring-rebalance-vc4l8\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " pod="openstack/swift-ring-rebalance-vc4l8" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.985052 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cb2f895-3d57-468e-8197-636fcc33afe4-combined-ca-bundle\") pod \"swift-ring-rebalance-vc4l8\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " pod="openstack/swift-ring-rebalance-vc4l8" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.985075 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0cb2f895-3d57-468e-8197-636fcc33afe4-scripts\") pod \"swift-ring-rebalance-vc4l8\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " pod="openstack/swift-ring-rebalance-vc4l8" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.985290 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f8bd308c-0bb8-49f6-a629-8d24b1c5d55e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f8bd308c-0bb8-49f6-a629-8d24b1c5d55e" (UID: "f8bd308c-0bb8-49f6-a629-8d24b1c5d55e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.985716 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0cb2f895-3d57-468e-8197-636fcc33afe4-etc-swift\") pod \"swift-ring-rebalance-vc4l8\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " pod="openstack/swift-ring-rebalance-vc4l8" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.985640 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1adc4451-3322-4686-aeb5-ea4a457724a4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1adc4451-3322-4686-aeb5-ea4a457724a4" (UID: "1adc4451-3322-4686-aeb5-ea4a457724a4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.986043 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1adc4451-3322-4686-aeb5-ea4a457724a4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.986068 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8bd308c-0bb8-49f6-a629-8d24b1c5d55e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.986112 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0cb2f895-3d57-468e-8197-636fcc33afe4-ring-data-devices\") pod \"swift-ring-rebalance-vc4l8\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " pod="openstack/swift-ring-rebalance-vc4l8" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.986164 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0cb2f895-3d57-468e-8197-636fcc33afe4-etc-swift\") pod \"swift-ring-rebalance-vc4l8\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " pod="openstack/swift-ring-rebalance-vc4l8" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.986197 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0cb2f895-3d57-468e-8197-636fcc33afe4-scripts\") pod \"swift-ring-rebalance-vc4l8\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " pod="openstack/swift-ring-rebalance-vc4l8" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.989661 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cb2f895-3d57-468e-8197-636fcc33afe4-combined-ca-bundle\") pod 
\"swift-ring-rebalance-vc4l8\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " pod="openstack/swift-ring-rebalance-vc4l8" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.989855 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8bd308c-0bb8-49f6-a629-8d24b1c5d55e-kube-api-access-6bs9n" (OuterVolumeSpecName: "kube-api-access-6bs9n") pod "f8bd308c-0bb8-49f6-a629-8d24b1c5d55e" (UID: "f8bd308c-0bb8-49f6-a629-8d24b1c5d55e"). InnerVolumeSpecName "kube-api-access-6bs9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.989921 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1adc4451-3322-4686-aeb5-ea4a457724a4-kube-api-access-r5kfj" (OuterVolumeSpecName: "kube-api-access-r5kfj") pod "1adc4451-3322-4686-aeb5-ea4a457724a4" (UID: "1adc4451-3322-4686-aeb5-ea4a457724a4"). InnerVolumeSpecName "kube-api-access-r5kfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.994492 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0cb2f895-3d57-468e-8197-636fcc33afe4-swiftconf\") pod \"swift-ring-rebalance-vc4l8\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " pod="openstack/swift-ring-rebalance-vc4l8" Feb 18 06:04:33 crc kubenswrapper[4869]: I0218 06:04:33.995481 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0cb2f895-3d57-468e-8197-636fcc33afe4-dispersionconf\") pod \"swift-ring-rebalance-vc4l8\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " pod="openstack/swift-ring-rebalance-vc4l8" Feb 18 06:04:34 crc kubenswrapper[4869]: I0218 06:04:34.001425 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shkww\" (UniqueName: 
\"kubernetes.io/projected/0cb2f895-3d57-468e-8197-636fcc33afe4-kube-api-access-shkww\") pod \"swift-ring-rebalance-vc4l8\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " pod="openstack/swift-ring-rebalance-vc4l8" Feb 18 06:04:34 crc kubenswrapper[4869]: I0218 06:04:34.087947 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5kfj\" (UniqueName: \"kubernetes.io/projected/1adc4451-3322-4686-aeb5-ea4a457724a4-kube-api-access-r5kfj\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:34 crc kubenswrapper[4869]: I0218 06:04:34.088002 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bs9n\" (UniqueName: \"kubernetes.io/projected/f8bd308c-0bb8-49f6-a629-8d24b1c5d55e-kube-api-access-6bs9n\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:34 crc kubenswrapper[4869]: I0218 06:04:34.211524 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-s9bpj"] Feb 18 06:04:34 crc kubenswrapper[4869]: W0218 06:04:34.223646 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9f44589_815e_4617_ad29_beba672ea7b3.slice/crio-33a35253d9ab098ab9944dbf978f8a963c64bca2e2b2c32ec97d625ac7b00954 WatchSource:0}: Error finding container 33a35253d9ab098ab9944dbf978f8a963c64bca2e2b2c32ec97d625ac7b00954: Status 404 returned error can't find the container with id 33a35253d9ab098ab9944dbf978f8a963c64bca2e2b2c32ec97d625ac7b00954 Feb 18 06:04:34 crc kubenswrapper[4869]: I0218 06:04:34.227942 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vc4l8"
Feb 18 06:04:34 crc kubenswrapper[4869]: I0218 06:04:34.408864 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-sr5s4" event={"ID":"1adc4451-3322-4686-aeb5-ea4a457724a4","Type":"ContainerDied","Data":"51c8d4ca5a4138a846aae7251ea5b8c7358a57fa1939c6cd2c5eaf185e0ea638"}
Feb 18 06:04:34 crc kubenswrapper[4869]: I0218 06:04:34.409212 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51c8d4ca5a4138a846aae7251ea5b8c7358a57fa1939c6cd2c5eaf185e0ea638"
Feb 18 06:04:34 crc kubenswrapper[4869]: I0218 06:04:34.408924 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-sr5s4"
Feb 18 06:04:34 crc kubenswrapper[4869]: I0218 06:04:34.412277 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vsntf" event={"ID":"f8bd308c-0bb8-49f6-a629-8d24b1c5d55e","Type":"ContainerDied","Data":"727389908c8627905e3c61b8f063f2bb7a8b072611390d0d593e69ccd3293bc1"}
Feb 18 06:04:34 crc kubenswrapper[4869]: I0218 06:04:34.412341 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="727389908c8627905e3c61b8f063f2bb7a8b072611390d0d593e69ccd3293bc1"
Feb 18 06:04:34 crc kubenswrapper[4869]: I0218 06:04:34.412424 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-vsntf"
Feb 18 06:04:34 crc kubenswrapper[4869]: I0218 06:04:34.413767 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-s9bpj" event={"ID":"c9f44589-815e-4617-ad29-beba672ea7b3","Type":"ContainerStarted","Data":"33a35253d9ab098ab9944dbf978f8a963c64bca2e2b2c32ec97d625ac7b00954"}
Feb 18 06:04:34 crc kubenswrapper[4869]: I0218 06:04:34.688049 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vc4l8"]
Feb 18 06:04:34 crc kubenswrapper[4869]: I0218 06:04:34.787028 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2d5b-account-create-update-7v5gt"
Feb 18 06:04:34 crc kubenswrapper[4869]: I0218 06:04:34.908427 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhxsb\" (UniqueName: \"kubernetes.io/projected/a2ae3a7e-91a4-4a0f-9537-d0438699a82f-kube-api-access-dhxsb\") pod \"a2ae3a7e-91a4-4a0f-9537-d0438699a82f\" (UID: \"a2ae3a7e-91a4-4a0f-9537-d0438699a82f\") "
Feb 18 06:04:34 crc kubenswrapper[4869]: I0218 06:04:34.908883 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2ae3a7e-91a4-4a0f-9537-d0438699a82f-operator-scripts\") pod \"a2ae3a7e-91a4-4a0f-9537-d0438699a82f\" (UID: \"a2ae3a7e-91a4-4a0f-9537-d0438699a82f\") "
Feb 18 06:04:34 crc kubenswrapper[4869]: I0218 06:04:34.909651 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2ae3a7e-91a4-4a0f-9537-d0438699a82f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a2ae3a7e-91a4-4a0f-9537-d0438699a82f" (UID: "a2ae3a7e-91a4-4a0f-9537-d0438699a82f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:04:34 crc kubenswrapper[4869]: I0218 06:04:34.914615 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2ae3a7e-91a4-4a0f-9537-d0438699a82f-kube-api-access-dhxsb" (OuterVolumeSpecName: "kube-api-access-dhxsb") pod "a2ae3a7e-91a4-4a0f-9537-d0438699a82f" (UID: "a2ae3a7e-91a4-4a0f-9537-d0438699a82f"). InnerVolumeSpecName "kube-api-access-dhxsb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.010886 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhxsb\" (UniqueName: \"kubernetes.io/projected/a2ae3a7e-91a4-4a0f-9537-d0438699a82f-kube-api-access-dhxsb\") on node \"crc\" DevicePath \"\""
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.010941 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2ae3a7e-91a4-4a0f-9537-d0438699a82f-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.027786 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9a85-account-create-update-tbcd2"
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.033682 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-143a-account-create-update-n6nb8"
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.038628 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jhdcd"
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.111996 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkmr6\" (UniqueName: \"kubernetes.io/projected/2268e717-5605-4efd-aed0-0323235e9211-kube-api-access-pkmr6\") pod \"2268e717-5605-4efd-aed0-0323235e9211\" (UID: \"2268e717-5605-4efd-aed0-0323235e9211\") "
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.112086 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aaf77221-4fa8-4a41-b019-042eb77a2553-operator-scripts\") pod \"aaf77221-4fa8-4a41-b019-042eb77a2553\" (UID: \"aaf77221-4fa8-4a41-b019-042eb77a2553\") "
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.112284 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwgm9\" (UniqueName: \"kubernetes.io/projected/aaf77221-4fa8-4a41-b019-042eb77a2553-kube-api-access-cwgm9\") pod \"aaf77221-4fa8-4a41-b019-042eb77a2553\" (UID: \"aaf77221-4fa8-4a41-b019-042eb77a2553\") "
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.112936 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2268e717-5605-4efd-aed0-0323235e9211-operator-scripts\") pod \"2268e717-5605-4efd-aed0-0323235e9211\" (UID: \"2268e717-5605-4efd-aed0-0323235e9211\") "
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.113048 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6-operator-scripts\") pod \"d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6\" (UID: \"d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6\") "
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.113109 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b69b6\" (UniqueName: \"kubernetes.io/projected/d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6-kube-api-access-b69b6\") pod \"d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6\" (UID: \"d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6\") "
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.113158 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaf77221-4fa8-4a41-b019-042eb77a2553-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aaf77221-4fa8-4a41-b019-042eb77a2553" (UID: "aaf77221-4fa8-4a41-b019-042eb77a2553"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.113423 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2268e717-5605-4efd-aed0-0323235e9211-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2268e717-5605-4efd-aed0-0323235e9211" (UID: "2268e717-5605-4efd-aed0-0323235e9211"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.113673 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6" (UID: "d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.113881 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2268e717-5605-4efd-aed0-0323235e9211-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.113908 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.113922 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aaf77221-4fa8-4a41-b019-042eb77a2553-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.118499 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaf77221-4fa8-4a41-b019-042eb77a2553-kube-api-access-cwgm9" (OuterVolumeSpecName: "kube-api-access-cwgm9") pod "aaf77221-4fa8-4a41-b019-042eb77a2553" (UID: "aaf77221-4fa8-4a41-b019-042eb77a2553"). InnerVolumeSpecName "kube-api-access-cwgm9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.118625 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6-kube-api-access-b69b6" (OuterVolumeSpecName: "kube-api-access-b69b6") pod "d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6" (UID: "d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6"). InnerVolumeSpecName "kube-api-access-b69b6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.130284 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2268e717-5605-4efd-aed0-0323235e9211-kube-api-access-pkmr6" (OuterVolumeSpecName: "kube-api-access-pkmr6") pod "2268e717-5605-4efd-aed0-0323235e9211" (UID: "2268e717-5605-4efd-aed0-0323235e9211"). InnerVolumeSpecName "kube-api-access-pkmr6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.215635 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwgm9\" (UniqueName: \"kubernetes.io/projected/aaf77221-4fa8-4a41-b019-042eb77a2553-kube-api-access-cwgm9\") on node \"crc\" DevicePath \"\""
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.215682 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b69b6\" (UniqueName: \"kubernetes.io/projected/d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6-kube-api-access-b69b6\") on node \"crc\" DevicePath \"\""
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.215695 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkmr6\" (UniqueName: \"kubernetes.io/projected/2268e717-5605-4efd-aed0-0323235e9211-kube-api-access-pkmr6\") on node \"crc\" DevicePath \"\""
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.428419 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2d5b-account-create-update-7v5gt" event={"ID":"a2ae3a7e-91a4-4a0f-9537-d0438699a82f","Type":"ContainerDied","Data":"3a42b6c917e0e474cae8fda604821a88ef3e65e17db6b67cbf3df4e0b351920c"}
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.428479 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a42b6c917e0e474cae8fda604821a88ef3e65e17db6b67cbf3df4e0b351920c"
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.428441 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2d5b-account-create-update-7v5gt"
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.430695 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9a85-account-create-update-tbcd2"
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.430709 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9a85-account-create-update-tbcd2" event={"ID":"d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6","Type":"ContainerDied","Data":"832ea9537194d4e26a69602738d9f3e26e1ae7cb22b0a95d539b36c5578d1a46"}
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.430763 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="832ea9537194d4e26a69602738d9f3e26e1ae7cb22b0a95d539b36c5578d1a46"
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.432546 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-143a-account-create-update-n6nb8" event={"ID":"aaf77221-4fa8-4a41-b019-042eb77a2553","Type":"ContainerDied","Data":"bdec1df4e343faf32dc0d108e22819e5d4c58f6daee695fa5d08a4afd61a0346"}
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.432562 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-143a-account-create-update-n6nb8"
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.432566 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdec1df4e343faf32dc0d108e22819e5d4c58f6daee695fa5d08a4afd61a0346"
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.434668 4869 generic.go:334] "Generic (PLEG): container finished" podID="c9f44589-815e-4617-ad29-beba672ea7b3" containerID="a8b1294c2ad83064d6faed9445497098a3271d80ca3d87ddb2c0a98238751b1c" exitCode=0
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.434721 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-s9bpj" event={"ID":"c9f44589-815e-4617-ad29-beba672ea7b3","Type":"ContainerDied","Data":"a8b1294c2ad83064d6faed9445497098a3271d80ca3d87ddb2c0a98238751b1c"}
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.436095 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vc4l8" event={"ID":"0cb2f895-3d57-468e-8197-636fcc33afe4","Type":"ContainerStarted","Data":"a9aa216f67ed11e47f9208aa6f5b28a4d5eeda1b6df0f90a9cab135e7f11ad45"}
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.437647 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jhdcd" event={"ID":"2268e717-5605-4efd-aed0-0323235e9211","Type":"ContainerDied","Data":"b80ed7fb650c5f05bdd521ae8a7c91f2dd38fccc6c921c5da5c64b0fce09fa33"}
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.437671 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b80ed7fb650c5f05bdd521ae8a7c91f2dd38fccc6c921c5da5c64b0fce09fa33"
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.437700 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jhdcd"
Feb 18 06:04:35 crc kubenswrapper[4869]: I0218 06:04:35.487223 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cb5aea3-1e80-4cb0-bae7-7d9be7a77841" path="/var/lib/kubelet/pods/6cb5aea3-1e80-4cb0-bae7-7d9be7a77841/volumes"
Feb 18 06:04:36 crc kubenswrapper[4869]: I0218 06:04:36.942598 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-g2c4c"]
Feb 18 06:04:36 crc kubenswrapper[4869]: E0218 06:04:36.943477 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8bd308c-0bb8-49f6-a629-8d24b1c5d55e" containerName="mariadb-database-create"
Feb 18 06:04:36 crc kubenswrapper[4869]: I0218 06:04:36.943494 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8bd308c-0bb8-49f6-a629-8d24b1c5d55e" containerName="mariadb-database-create"
Feb 18 06:04:36 crc kubenswrapper[4869]: E0218 06:04:36.943515 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaf77221-4fa8-4a41-b019-042eb77a2553" containerName="mariadb-account-create-update"
Feb 18 06:04:36 crc kubenswrapper[4869]: I0218 06:04:36.943520 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaf77221-4fa8-4a41-b019-042eb77a2553" containerName="mariadb-account-create-update"
Feb 18 06:04:36 crc kubenswrapper[4869]: E0218 06:04:36.943531 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6" containerName="mariadb-account-create-update"
Feb 18 06:04:36 crc kubenswrapper[4869]: I0218 06:04:36.943540 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6" containerName="mariadb-account-create-update"
Feb 18 06:04:36 crc kubenswrapper[4869]: E0218 06:04:36.943563 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2268e717-5605-4efd-aed0-0323235e9211" containerName="mariadb-database-create"
Feb 18 06:04:36 crc kubenswrapper[4869]: I0218 06:04:36.943571 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="2268e717-5605-4efd-aed0-0323235e9211" containerName="mariadb-database-create"
Feb 18 06:04:36 crc kubenswrapper[4869]: E0218 06:04:36.943583 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1adc4451-3322-4686-aeb5-ea4a457724a4" containerName="mariadb-database-create"
Feb 18 06:04:36 crc kubenswrapper[4869]: I0218 06:04:36.943590 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="1adc4451-3322-4686-aeb5-ea4a457724a4" containerName="mariadb-database-create"
Feb 18 06:04:36 crc kubenswrapper[4869]: E0218 06:04:36.943606 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2ae3a7e-91a4-4a0f-9537-d0438699a82f" containerName="mariadb-account-create-update"
Feb 18 06:04:36 crc kubenswrapper[4869]: I0218 06:04:36.943614 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2ae3a7e-91a4-4a0f-9537-d0438699a82f" containerName="mariadb-account-create-update"
Feb 18 06:04:36 crc kubenswrapper[4869]: I0218 06:04:36.943769 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="2268e717-5605-4efd-aed0-0323235e9211" containerName="mariadb-database-create"
Feb 18 06:04:36 crc kubenswrapper[4869]: I0218 06:04:36.943784 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2ae3a7e-91a4-4a0f-9537-d0438699a82f" containerName="mariadb-account-create-update"
Feb 18 06:04:36 crc kubenswrapper[4869]: I0218 06:04:36.943792 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6" containerName="mariadb-account-create-update"
Feb 18 06:04:36 crc kubenswrapper[4869]: I0218 06:04:36.943800 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaf77221-4fa8-4a41-b019-042eb77a2553" containerName="mariadb-account-create-update"
Feb 18 06:04:36 crc kubenswrapper[4869]: I0218 06:04:36.943810 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="1adc4451-3322-4686-aeb5-ea4a457724a4" containerName="mariadb-database-create"
Feb 18 06:04:36 crc kubenswrapper[4869]: I0218 06:04:36.943818 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8bd308c-0bb8-49f6-a629-8d24b1c5d55e" containerName="mariadb-database-create"
Feb 18 06:04:36 crc kubenswrapper[4869]: I0218 06:04:36.944322 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-g2c4c"
Feb 18 06:04:36 crc kubenswrapper[4869]: I0218 06:04:36.951423 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Feb 18 06:04:36 crc kubenswrapper[4869]: I0218 06:04:36.951774 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-5sd9x"
Feb 18 06:04:36 crc kubenswrapper[4869]: I0218 06:04:36.969397 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-g2c4c"]
Feb 18 06:04:37 crc kubenswrapper[4869]: I0218 06:04:37.056855 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2t5f\" (UniqueName: \"kubernetes.io/projected/7aed2e80-e13f-49c6-8570-f6384bc1a079-kube-api-access-m2t5f\") pod \"glance-db-sync-g2c4c\" (UID: \"7aed2e80-e13f-49c6-8570-f6384bc1a079\") " pod="openstack/glance-db-sync-g2c4c"
Feb 18 06:04:37 crc kubenswrapper[4869]: I0218 06:04:37.056934 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aed2e80-e13f-49c6-8570-f6384bc1a079-combined-ca-bundle\") pod \"glance-db-sync-g2c4c\" (UID: \"7aed2e80-e13f-49c6-8570-f6384bc1a079\") " pod="openstack/glance-db-sync-g2c4c"
Feb 18 06:04:37 crc kubenswrapper[4869]: I0218 06:04:37.056962 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7aed2e80-e13f-49c6-8570-f6384bc1a079-db-sync-config-data\") pod \"glance-db-sync-g2c4c\" (UID: \"7aed2e80-e13f-49c6-8570-f6384bc1a079\") " pod="openstack/glance-db-sync-g2c4c"
Feb 18 06:04:37 crc kubenswrapper[4869]: I0218 06:04:37.057028 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aed2e80-e13f-49c6-8570-f6384bc1a079-config-data\") pod \"glance-db-sync-g2c4c\" (UID: \"7aed2e80-e13f-49c6-8570-f6384bc1a079\") " pod="openstack/glance-db-sync-g2c4c"
Feb 18 06:04:37 crc kubenswrapper[4869]: I0218 06:04:37.158588 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aed2e80-e13f-49c6-8570-f6384bc1a079-combined-ca-bundle\") pod \"glance-db-sync-g2c4c\" (UID: \"7aed2e80-e13f-49c6-8570-f6384bc1a079\") " pod="openstack/glance-db-sync-g2c4c"
Feb 18 06:04:37 crc kubenswrapper[4869]: I0218 06:04:37.158649 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7aed2e80-e13f-49c6-8570-f6384bc1a079-db-sync-config-data\") pod \"glance-db-sync-g2c4c\" (UID: \"7aed2e80-e13f-49c6-8570-f6384bc1a079\") " pod="openstack/glance-db-sync-g2c4c"
Feb 18 06:04:37 crc kubenswrapper[4869]: I0218 06:04:37.158714 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aed2e80-e13f-49c6-8570-f6384bc1a079-config-data\") pod \"glance-db-sync-g2c4c\" (UID: \"7aed2e80-e13f-49c6-8570-f6384bc1a079\") " pod="openstack/glance-db-sync-g2c4c"
Feb 18 06:04:37 crc kubenswrapper[4869]: I0218 06:04:37.158774 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2t5f\" (UniqueName: \"kubernetes.io/projected/7aed2e80-e13f-49c6-8570-f6384bc1a079-kube-api-access-m2t5f\") pod \"glance-db-sync-g2c4c\" (UID: \"7aed2e80-e13f-49c6-8570-f6384bc1a079\") " pod="openstack/glance-db-sync-g2c4c"
Feb 18 06:04:37 crc kubenswrapper[4869]: I0218 06:04:37.173408 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7aed2e80-e13f-49c6-8570-f6384bc1a079-db-sync-config-data\") pod \"glance-db-sync-g2c4c\" (UID: \"7aed2e80-e13f-49c6-8570-f6384bc1a079\") " pod="openstack/glance-db-sync-g2c4c"
Feb 18 06:04:37 crc kubenswrapper[4869]: I0218 06:04:37.173843 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aed2e80-e13f-49c6-8570-f6384bc1a079-config-data\") pod \"glance-db-sync-g2c4c\" (UID: \"7aed2e80-e13f-49c6-8570-f6384bc1a079\") " pod="openstack/glance-db-sync-g2c4c"
Feb 18 06:04:37 crc kubenswrapper[4869]: I0218 06:04:37.174161 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aed2e80-e13f-49c6-8570-f6384bc1a079-combined-ca-bundle\") pod \"glance-db-sync-g2c4c\" (UID: \"7aed2e80-e13f-49c6-8570-f6384bc1a079\") " pod="openstack/glance-db-sync-g2c4c"
Feb 18 06:04:37 crc kubenswrapper[4869]: I0218 06:04:37.174518 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2t5f\" (UniqueName: \"kubernetes.io/projected/7aed2e80-e13f-49c6-8570-f6384bc1a079-kube-api-access-m2t5f\") pod \"glance-db-sync-g2c4c\" (UID: \"7aed2e80-e13f-49c6-8570-f6384bc1a079\") " pod="openstack/glance-db-sync-g2c4c"
Feb 18 06:04:37 crc kubenswrapper[4869]: I0218 06:04:37.184358 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-8lrfw" podUID="6cb5aea3-1e80-4cb0-bae7-7d9be7a77841" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: i/o timeout"
Feb 18 06:04:37 crc kubenswrapper[4869]: I0218 06:04:37.267560 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-g2c4c"
Feb 18 06:04:37 crc kubenswrapper[4869]: I0218 06:04:37.769971 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-etc-swift\") pod \"swift-storage-0\" (UID: \"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624\") " pod="openstack/swift-storage-0"
Feb 18 06:04:37 crc kubenswrapper[4869]: E0218 06:04:37.770210 4869 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 18 06:04:37 crc kubenswrapper[4869]: E0218 06:04:37.770252 4869 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 18 06:04:37 crc kubenswrapper[4869]: E0218 06:04:37.770331 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-etc-swift podName:253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624 nodeName:}" failed. No retries permitted until 2026-02-18 06:04:45.770304648 +0000 UTC m=+982.939392900 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-etc-swift") pod "swift-storage-0" (UID: "253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624") : configmap "swift-ring-files" not found
Feb 18 06:04:38 crc kubenswrapper[4869]: I0218 06:04:38.319026 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-s9bpj"
Feb 18 06:04:38 crc kubenswrapper[4869]: I0218 06:04:38.380022 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwjfh\" (UniqueName: \"kubernetes.io/projected/c9f44589-815e-4617-ad29-beba672ea7b3-kube-api-access-vwjfh\") pod \"c9f44589-815e-4617-ad29-beba672ea7b3\" (UID: \"c9f44589-815e-4617-ad29-beba672ea7b3\") "
Feb 18 06:04:38 crc kubenswrapper[4869]: I0218 06:04:38.380221 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9f44589-815e-4617-ad29-beba672ea7b3-operator-scripts\") pod \"c9f44589-815e-4617-ad29-beba672ea7b3\" (UID: \"c9f44589-815e-4617-ad29-beba672ea7b3\") "
Feb 18 06:04:38 crc kubenswrapper[4869]: I0218 06:04:38.381250 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9f44589-815e-4617-ad29-beba672ea7b3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c9f44589-815e-4617-ad29-beba672ea7b3" (UID: "c9f44589-815e-4617-ad29-beba672ea7b3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:04:38 crc kubenswrapper[4869]: I0218 06:04:38.385997 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9f44589-815e-4617-ad29-beba672ea7b3-kube-api-access-vwjfh" (OuterVolumeSpecName: "kube-api-access-vwjfh") pod "c9f44589-815e-4617-ad29-beba672ea7b3" (UID: "c9f44589-815e-4617-ad29-beba672ea7b3"). InnerVolumeSpecName "kube-api-access-vwjfh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:04:38 crc kubenswrapper[4869]: I0218 06:04:38.462124 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-s9bpj" event={"ID":"c9f44589-815e-4617-ad29-beba672ea7b3","Type":"ContainerDied","Data":"33a35253d9ab098ab9944dbf978f8a963c64bca2e2b2c32ec97d625ac7b00954"}
Feb 18 06:04:38 crc kubenswrapper[4869]: I0218 06:04:38.462181 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33a35253d9ab098ab9944dbf978f8a963c64bca2e2b2c32ec97d625ac7b00954"
Feb 18 06:04:38 crc kubenswrapper[4869]: I0218 06:04:38.462273 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-s9bpj"
Feb 18 06:04:38 crc kubenswrapper[4869]: I0218 06:04:38.481479 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9f44589-815e-4617-ad29-beba672ea7b3-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 06:04:38 crc kubenswrapper[4869]: I0218 06:04:38.481739 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwjfh\" (UniqueName: \"kubernetes.io/projected/c9f44589-815e-4617-ad29-beba672ea7b3-kube-api-access-vwjfh\") on node \"crc\" DevicePath \"\""
Feb 18 06:04:38 crc kubenswrapper[4869]: I0218 06:04:38.775065 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-g2c4c"]
Feb 18 06:04:38 crc kubenswrapper[4869]: W0218 06:04:38.777137 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7aed2e80_e13f_49c6_8570_f6384bc1a079.slice/crio-f33a5e846af9c9ce9c066a495fbccf68bb6e58c15988dd2271873f83bf1b3d13 WatchSource:0}: Error finding container f33a5e846af9c9ce9c066a495fbccf68bb6e58c15988dd2271873f83bf1b3d13: Status 404 returned error can't find the container with id f33a5e846af9c9ce9c066a495fbccf68bb6e58c15988dd2271873f83bf1b3d13
Feb 18 06:04:39 crc kubenswrapper[4869]: I0218 06:04:39.191955 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-rx7zr"
Feb 18 06:04:39 crc kubenswrapper[4869]: I0218 06:04:39.332654 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-tlzx5"]
Feb 18 06:04:39 crc kubenswrapper[4869]: I0218 06:04:39.333124 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-tlzx5" podUID="bb0523a5-e2bc-45f8-aff2-de770dcf88b2" containerName="dnsmasq-dns" containerID="cri-o://aa23597f51eb49ef7f9286909c936650102b08b9ee0c4f8d5a03eca17ea72431" gracePeriod=10
Feb 18 06:04:39 crc kubenswrapper[4869]: I0218 06:04:39.482481 4869 generic.go:334] "Generic (PLEG): container finished" podID="bb0523a5-e2bc-45f8-aff2-de770dcf88b2" containerID="aa23597f51eb49ef7f9286909c936650102b08b9ee0c4f8d5a03eca17ea72431" exitCode=0
Feb 18 06:04:39 crc kubenswrapper[4869]: I0218 06:04:39.484285 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vc4l8" event={"ID":"0cb2f895-3d57-468e-8197-636fcc33afe4","Type":"ContainerStarted","Data":"0dbf40fdac3ba3b67c2fef6f3ec2d3d742be8b2640984b45042e85e4935fece2"}
Feb 18 06:04:39 crc kubenswrapper[4869]: I0218 06:04:39.484328 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-tlzx5" event={"ID":"bb0523a5-e2bc-45f8-aff2-de770dcf88b2","Type":"ContainerDied","Data":"aa23597f51eb49ef7f9286909c936650102b08b9ee0c4f8d5a03eca17ea72431"}
Feb 18 06:04:39 crc kubenswrapper[4869]: I0218 06:04:39.484345 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-g2c4c" event={"ID":"7aed2e80-e13f-49c6-8570-f6384bc1a079","Type":"ContainerStarted","Data":"f33a5e846af9c9ce9c066a495fbccf68bb6e58c15988dd2271873f83bf1b3d13"}
Feb 18 06:04:39 crc kubenswrapper[4869]: I0218 06:04:39.494123 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-vc4l8" podStartSLOduration=2.888892723 podStartE2EDuration="6.494109381s" podCreationTimestamp="2026-02-18 06:04:33 +0000 UTC" firstStartedPulling="2026-02-18 06:04:34.722812242 +0000 UTC m=+971.891900494" lastFinishedPulling="2026-02-18 06:04:38.32802892 +0000 UTC m=+975.497117152" observedRunningTime="2026-02-18 06:04:39.491329025 +0000 UTC m=+976.660417257" watchObservedRunningTime="2026-02-18 06:04:39.494109381 +0000 UTC m=+976.663197613"
Feb 18 06:04:39 crc kubenswrapper[4869]: I0218 06:04:39.780582 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-s9bpj"]
Feb 18 06:04:39 crc kubenswrapper[4869]: I0218 06:04:39.785993 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-s9bpj"]
Feb 18 06:04:39 crc kubenswrapper[4869]: I0218 06:04:39.862616 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-tlzx5"
Feb 18 06:04:39 crc kubenswrapper[4869]: I0218 06:04:39.943861 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb0523a5-e2bc-45f8-aff2-de770dcf88b2-config\") pod \"bb0523a5-e2bc-45f8-aff2-de770dcf88b2\" (UID: \"bb0523a5-e2bc-45f8-aff2-de770dcf88b2\") "
Feb 18 06:04:39 crc kubenswrapper[4869]: I0218 06:04:39.943973 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb0523a5-e2bc-45f8-aff2-de770dcf88b2-dns-svc\") pod \"bb0523a5-e2bc-45f8-aff2-de770dcf88b2\" (UID: \"bb0523a5-e2bc-45f8-aff2-de770dcf88b2\") "
Feb 18 06:04:39 crc kubenswrapper[4869]: I0218 06:04:39.944028 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn4rh\" (UniqueName: \"kubernetes.io/projected/bb0523a5-e2bc-45f8-aff2-de770dcf88b2-kube-api-access-hn4rh\") pod \"bb0523a5-e2bc-45f8-aff2-de770dcf88b2\" (UID: \"bb0523a5-e2bc-45f8-aff2-de770dcf88b2\") "
Feb 18 06:04:39 crc kubenswrapper[4869]: I0218 06:04:39.951217 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb0523a5-e2bc-45f8-aff2-de770dcf88b2-kube-api-access-hn4rh" (OuterVolumeSpecName: "kube-api-access-hn4rh") pod "bb0523a5-e2bc-45f8-aff2-de770dcf88b2" (UID: "bb0523a5-e2bc-45f8-aff2-de770dcf88b2"). InnerVolumeSpecName "kube-api-access-hn4rh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:04:39 crc kubenswrapper[4869]: I0218 06:04:39.989659 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb0523a5-e2bc-45f8-aff2-de770dcf88b2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bb0523a5-e2bc-45f8-aff2-de770dcf88b2" (UID: "bb0523a5-e2bc-45f8-aff2-de770dcf88b2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:04:39 crc kubenswrapper[4869]: I0218 06:04:39.990244 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb0523a5-e2bc-45f8-aff2-de770dcf88b2-config" (OuterVolumeSpecName: "config") pod "bb0523a5-e2bc-45f8-aff2-de770dcf88b2" (UID: "bb0523a5-e2bc-45f8-aff2-de770dcf88b2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:04:40 crc kubenswrapper[4869]: I0218 06:04:40.045831 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb0523a5-e2bc-45f8-aff2-de770dcf88b2-config\") on node \"crc\" DevicePath \"\""
Feb 18 06:04:40 crc kubenswrapper[4869]: I0218 06:04:40.046126 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb0523a5-e2bc-45f8-aff2-de770dcf88b2-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 18 06:04:40 crc kubenswrapper[4869]: I0218 06:04:40.046137 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn4rh\" (UniqueName: \"kubernetes.io/projected/bb0523a5-e2bc-45f8-aff2-de770dcf88b2-kube-api-access-hn4rh\") on node \"crc\" DevicePath \"\""
Feb 18 06:04:40 crc kubenswrapper[4869]: I0218 06:04:40.502667 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-tlzx5"
Feb 18 06:04:40 crc kubenswrapper[4869]: I0218 06:04:40.503180 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-tlzx5" event={"ID":"bb0523a5-e2bc-45f8-aff2-de770dcf88b2","Type":"ContainerDied","Data":"e8c539975cfd907eef7330a42d2e72e3e1c40379c945001303cefd2c7c1ed78f"}
Feb 18 06:04:40 crc kubenswrapper[4869]: I0218 06:04:40.503888 4869 scope.go:117] "RemoveContainer" containerID="aa23597f51eb49ef7f9286909c936650102b08b9ee0c4f8d5a03eca17ea72431"
Feb 18 06:04:40 crc kubenswrapper[4869]: I0218 06:04:40.566121 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-tlzx5"]
Feb 18 06:04:40 crc kubenswrapper[4869]: I0218 06:04:40.583782 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-tlzx5"]
Feb 18 06:04:40 crc kubenswrapper[4869]: I0218 06:04:40.590067 4869 scope.go:117] "RemoveContainer" containerID="85bdfad990098054cacb33bd58daf0b06f02de291e8abaf24581148c7961494d"
Feb 18 06:04:41 crc kubenswrapper[4869]: I0218 06:04:41.483735 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb0523a5-e2bc-45f8-aff2-de770dcf88b2" path="/var/lib/kubelet/pods/bb0523a5-e2bc-45f8-aff2-de770dcf88b2/volumes"
Feb 18 06:04:41 crc kubenswrapper[4869]: I0218 06:04:41.484292 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9f44589-815e-4617-ad29-beba672ea7b3" path="/var/lib/kubelet/pods/c9f44589-815e-4617-ad29-beba672ea7b3/volumes"
Feb 18 06:04:43 crc kubenswrapper[4869]: I0218 06:04:43.538079 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-lg2gb"]
Feb 18 06:04:43 crc kubenswrapper[4869]: E0218 06:04:43.538462 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb0523a5-e2bc-45f8-aff2-de770dcf88b2" containerName="init"
Feb 18 06:04:43 crc kubenswrapper[4869]: I0218 06:04:43.538476 4869
state_mem.go:107] "Deleted CPUSet assignment" podUID="bb0523a5-e2bc-45f8-aff2-de770dcf88b2" containerName="init" Feb 18 06:04:43 crc kubenswrapper[4869]: E0218 06:04:43.538498 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9f44589-815e-4617-ad29-beba672ea7b3" containerName="mariadb-account-create-update" Feb 18 06:04:43 crc kubenswrapper[4869]: I0218 06:04:43.538504 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9f44589-815e-4617-ad29-beba672ea7b3" containerName="mariadb-account-create-update" Feb 18 06:04:43 crc kubenswrapper[4869]: E0218 06:04:43.538516 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb0523a5-e2bc-45f8-aff2-de770dcf88b2" containerName="dnsmasq-dns" Feb 18 06:04:43 crc kubenswrapper[4869]: I0218 06:04:43.538523 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb0523a5-e2bc-45f8-aff2-de770dcf88b2" containerName="dnsmasq-dns" Feb 18 06:04:43 crc kubenswrapper[4869]: I0218 06:04:43.538728 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb0523a5-e2bc-45f8-aff2-de770dcf88b2" containerName="dnsmasq-dns" Feb 18 06:04:43 crc kubenswrapper[4869]: I0218 06:04:43.538769 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9f44589-815e-4617-ad29-beba672ea7b3" containerName="mariadb-account-create-update" Feb 18 06:04:43 crc kubenswrapper[4869]: I0218 06:04:43.539301 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-lg2gb" Feb 18 06:04:43 crc kubenswrapper[4869]: I0218 06:04:43.544092 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 18 06:04:43 crc kubenswrapper[4869]: I0218 06:04:43.551273 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-lg2gb"] Feb 18 06:04:43 crc kubenswrapper[4869]: I0218 06:04:43.709370 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zzsk\" (UniqueName: \"kubernetes.io/projected/83ca03e8-e3ed-4a29-a58f-feaf6f76292b-kube-api-access-5zzsk\") pod \"root-account-create-update-lg2gb\" (UID: \"83ca03e8-e3ed-4a29-a58f-feaf6f76292b\") " pod="openstack/root-account-create-update-lg2gb" Feb 18 06:04:43 crc kubenswrapper[4869]: I0218 06:04:43.709652 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83ca03e8-e3ed-4a29-a58f-feaf6f76292b-operator-scripts\") pod \"root-account-create-update-lg2gb\" (UID: \"83ca03e8-e3ed-4a29-a58f-feaf6f76292b\") " pod="openstack/root-account-create-update-lg2gb" Feb 18 06:04:43 crc kubenswrapper[4869]: I0218 06:04:43.812533 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zzsk\" (UniqueName: \"kubernetes.io/projected/83ca03e8-e3ed-4a29-a58f-feaf6f76292b-kube-api-access-5zzsk\") pod \"root-account-create-update-lg2gb\" (UID: \"83ca03e8-e3ed-4a29-a58f-feaf6f76292b\") " pod="openstack/root-account-create-update-lg2gb" Feb 18 06:04:43 crc kubenswrapper[4869]: I0218 06:04:43.812636 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83ca03e8-e3ed-4a29-a58f-feaf6f76292b-operator-scripts\") pod \"root-account-create-update-lg2gb\" (UID: 
\"83ca03e8-e3ed-4a29-a58f-feaf6f76292b\") " pod="openstack/root-account-create-update-lg2gb" Feb 18 06:04:43 crc kubenswrapper[4869]: I0218 06:04:43.813807 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83ca03e8-e3ed-4a29-a58f-feaf6f76292b-operator-scripts\") pod \"root-account-create-update-lg2gb\" (UID: \"83ca03e8-e3ed-4a29-a58f-feaf6f76292b\") " pod="openstack/root-account-create-update-lg2gb" Feb 18 06:04:43 crc kubenswrapper[4869]: I0218 06:04:43.844038 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zzsk\" (UniqueName: \"kubernetes.io/projected/83ca03e8-e3ed-4a29-a58f-feaf6f76292b-kube-api-access-5zzsk\") pod \"root-account-create-update-lg2gb\" (UID: \"83ca03e8-e3ed-4a29-a58f-feaf6f76292b\") " pod="openstack/root-account-create-update-lg2gb" Feb 18 06:04:43 crc kubenswrapper[4869]: I0218 06:04:43.907622 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-lg2gb" Feb 18 06:04:44 crc kubenswrapper[4869]: I0218 06:04:44.074496 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 18 06:04:44 crc kubenswrapper[4869]: I0218 06:04:44.358014 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-lg2gb"] Feb 18 06:04:45 crc kubenswrapper[4869]: I0218 06:04:45.576728 4869 generic.go:334] "Generic (PLEG): container finished" podID="0cb2f895-3d57-468e-8197-636fcc33afe4" containerID="0dbf40fdac3ba3b67c2fef6f3ec2d3d742be8b2640984b45042e85e4935fece2" exitCode=0 Feb 18 06:04:45 crc kubenswrapper[4869]: I0218 06:04:45.576781 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vc4l8" event={"ID":"0cb2f895-3d57-468e-8197-636fcc33afe4","Type":"ContainerDied","Data":"0dbf40fdac3ba3b67c2fef6f3ec2d3d742be8b2640984b45042e85e4935fece2"} Feb 18 06:04:45 crc kubenswrapper[4869]: I0218 06:04:45.850659 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-etc-swift\") pod \"swift-storage-0\" (UID: \"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624\") " pod="openstack/swift-storage-0" Feb 18 06:04:45 crc kubenswrapper[4869]: I0218 06:04:45.863673 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624-etc-swift\") pod \"swift-storage-0\" (UID: \"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624\") " pod="openstack/swift-storage-0" Feb 18 06:04:46 crc kubenswrapper[4869]: I0218 06:04:46.150592 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 18 06:04:46 crc kubenswrapper[4869]: I0218 06:04:46.674387 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-6zzxt" podUID="0b85434e-56f8-4cab-91a5-8cf0ea0356fc" containerName="ovn-controller" probeResult="failure" output=< Feb 18 06:04:46 crc kubenswrapper[4869]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 18 06:04:46 crc kubenswrapper[4869]: > Feb 18 06:04:46 crc kubenswrapper[4869]: I0218 06:04:46.715793 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-5czp2" Feb 18 06:04:49 crc kubenswrapper[4869]: I0218 06:04:49.603723 4869 generic.go:334] "Generic (PLEG): container finished" podID="3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90" containerID="1683808ddc02a9672a20148fe3fe1be88215576ea37bf98a1fdafb9845128cd9" exitCode=0 Feb 18 06:04:49 crc kubenswrapper[4869]: I0218 06:04:49.603819 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90","Type":"ContainerDied","Data":"1683808ddc02a9672a20148fe3fe1be88215576ea37bf98a1fdafb9845128cd9"} Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.537824 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vc4l8" Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.636652 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vc4l8" event={"ID":"0cb2f895-3d57-468e-8197-636fcc33afe4","Type":"ContainerDied","Data":"a9aa216f67ed11e47f9208aa6f5b28a4d5eeda1b6df0f90a9cab135e7f11ad45"} Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.636702 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9aa216f67ed11e47f9208aa6f5b28a4d5eeda1b6df0f90a9cab135e7f11ad45" Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.636809 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vc4l8" Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.638649 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lg2gb" event={"ID":"83ca03e8-e3ed-4a29-a58f-feaf6f76292b","Type":"ContainerStarted","Data":"45b3cf6ef8f54d5f0719bad3e36ddcdb89085e6d2f09ff43f1839d64ff23a030"} Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.668929 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0cb2f895-3d57-468e-8197-636fcc33afe4-swiftconf\") pod \"0cb2f895-3d57-468e-8197-636fcc33afe4\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.669867 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0cb2f895-3d57-468e-8197-636fcc33afe4-dispersionconf\") pod \"0cb2f895-3d57-468e-8197-636fcc33afe4\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.669904 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/0cb2f895-3d57-468e-8197-636fcc33afe4-ring-data-devices\") pod \"0cb2f895-3d57-468e-8197-636fcc33afe4\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.669994 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cb2f895-3d57-468e-8197-636fcc33afe4-combined-ca-bundle\") pod \"0cb2f895-3d57-468e-8197-636fcc33afe4\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.670056 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shkww\" (UniqueName: \"kubernetes.io/projected/0cb2f895-3d57-468e-8197-636fcc33afe4-kube-api-access-shkww\") pod \"0cb2f895-3d57-468e-8197-636fcc33afe4\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.670076 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0cb2f895-3d57-468e-8197-636fcc33afe4-scripts\") pod \"0cb2f895-3d57-468e-8197-636fcc33afe4\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.670155 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0cb2f895-3d57-468e-8197-636fcc33afe4-etc-swift\") pod \"0cb2f895-3d57-468e-8197-636fcc33afe4\" (UID: \"0cb2f895-3d57-468e-8197-636fcc33afe4\") " Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.671200 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cb2f895-3d57-468e-8197-636fcc33afe4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0cb2f895-3d57-468e-8197-636fcc33afe4" (UID: "0cb2f895-3d57-468e-8197-636fcc33afe4"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.672001 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cb2f895-3d57-468e-8197-636fcc33afe4-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0cb2f895-3d57-468e-8197-636fcc33afe4" (UID: "0cb2f895-3d57-468e-8197-636fcc33afe4"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.690032 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cb2f895-3d57-468e-8197-636fcc33afe4-kube-api-access-shkww" (OuterVolumeSpecName: "kube-api-access-shkww") pod "0cb2f895-3d57-468e-8197-636fcc33afe4" (UID: "0cb2f895-3d57-468e-8197-636fcc33afe4"). InnerVolumeSpecName "kube-api-access-shkww". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.702191 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cb2f895-3d57-468e-8197-636fcc33afe4-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0cb2f895-3d57-468e-8197-636fcc33afe4" (UID: "0cb2f895-3d57-468e-8197-636fcc33afe4"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.723128 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-6zzxt" podUID="0b85434e-56f8-4cab-91a5-8cf0ea0356fc" containerName="ovn-controller" probeResult="failure" output=< Feb 18 06:04:51 crc kubenswrapper[4869]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 18 06:04:51 crc kubenswrapper[4869]: > Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.723961 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cb2f895-3d57-468e-8197-636fcc33afe4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cb2f895-3d57-468e-8197-636fcc33afe4" (UID: "0cb2f895-3d57-468e-8197-636fcc33afe4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.741337 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cb2f895-3d57-468e-8197-636fcc33afe4-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0cb2f895-3d57-468e-8197-636fcc33afe4" (UID: "0cb2f895-3d57-468e-8197-636fcc33afe4"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.755530 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cb2f895-3d57-468e-8197-636fcc33afe4-scripts" (OuterVolumeSpecName: "scripts") pod "0cb2f895-3d57-468e-8197-636fcc33afe4" (UID: "0cb2f895-3d57-468e-8197-636fcc33afe4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.773859 4869 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0cb2f895-3d57-468e-8197-636fcc33afe4-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.773895 4869 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0cb2f895-3d57-468e-8197-636fcc33afe4-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.773905 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cb2f895-3d57-468e-8197-636fcc33afe4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.773920 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shkww\" (UniqueName: \"kubernetes.io/projected/0cb2f895-3d57-468e-8197-636fcc33afe4-kube-api-access-shkww\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.773932 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0cb2f895-3d57-468e-8197-636fcc33afe4-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.773941 4869 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0cb2f895-3d57-468e-8197-636fcc33afe4-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.773950 4869 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0cb2f895-3d57-468e-8197-636fcc33afe4-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:51 crc kubenswrapper[4869]: I0218 06:04:51.807290 4869 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-5czp2" Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.074035 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6zzxt-config-hkb9k"] Feb 18 06:04:52 crc kubenswrapper[4869]: E0218 06:04:52.084958 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb2f895-3d57-468e-8197-636fcc33afe4" containerName="swift-ring-rebalance" Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.084993 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb2f895-3d57-468e-8197-636fcc33afe4" containerName="swift-ring-rebalance" Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.085325 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb2f895-3d57-468e-8197-636fcc33afe4" containerName="swift-ring-rebalance" Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.085852 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6zzxt-config-hkb9k" Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.090028 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.099236 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6zzxt-config-hkb9k"] Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.184704 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-additional-scripts\") pod \"ovn-controller-6zzxt-config-hkb9k\" (UID: \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\") " pod="openstack/ovn-controller-6zzxt-config-hkb9k" Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.184793 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-var-run\") pod \"ovn-controller-6zzxt-config-hkb9k\" (UID: \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\") " pod="openstack/ovn-controller-6zzxt-config-hkb9k" Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.184819 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-scripts\") pod \"ovn-controller-6zzxt-config-hkb9k\" (UID: \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\") " pod="openstack/ovn-controller-6zzxt-config-hkb9k" Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.184860 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-var-log-ovn\") pod \"ovn-controller-6zzxt-config-hkb9k\" (UID: 
\"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\") " pod="openstack/ovn-controller-6zzxt-config-hkb9k" Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.184914 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9428\" (UniqueName: \"kubernetes.io/projected/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-kube-api-access-k9428\") pod \"ovn-controller-6zzxt-config-hkb9k\" (UID: \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\") " pod="openstack/ovn-controller-6zzxt-config-hkb9k" Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.184946 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-var-run-ovn\") pod \"ovn-controller-6zzxt-config-hkb9k\" (UID: \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\") " pod="openstack/ovn-controller-6zzxt-config-hkb9k" Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.195300 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.286790 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-var-run-ovn\") pod \"ovn-controller-6zzxt-config-hkb9k\" (UID: \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\") " pod="openstack/ovn-controller-6zzxt-config-hkb9k" Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.286942 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-additional-scripts\") pod \"ovn-controller-6zzxt-config-hkb9k\" (UID: \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\") " pod="openstack/ovn-controller-6zzxt-config-hkb9k" Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.287019 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-var-run\") pod \"ovn-controller-6zzxt-config-hkb9k\" (UID: \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\") " pod="openstack/ovn-controller-6zzxt-config-hkb9k" Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.287044 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-scripts\") pod \"ovn-controller-6zzxt-config-hkb9k\" (UID: \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\") " pod="openstack/ovn-controller-6zzxt-config-hkb9k" Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.287104 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-var-log-ovn\") pod \"ovn-controller-6zzxt-config-hkb9k\" (UID: \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\") " pod="openstack/ovn-controller-6zzxt-config-hkb9k" Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.287170 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9428\" (UniqueName: \"kubernetes.io/projected/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-kube-api-access-k9428\") pod \"ovn-controller-6zzxt-config-hkb9k\" (UID: \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\") " pod="openstack/ovn-controller-6zzxt-config-hkb9k" Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.287668 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-var-log-ovn\") pod \"ovn-controller-6zzxt-config-hkb9k\" (UID: \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\") " pod="openstack/ovn-controller-6zzxt-config-hkb9k" Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.287667 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-var-run-ovn\") pod \"ovn-controller-6zzxt-config-hkb9k\" (UID: \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\") " pod="openstack/ovn-controller-6zzxt-config-hkb9k" Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.287908 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-var-run\") pod \"ovn-controller-6zzxt-config-hkb9k\" (UID: \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\") " pod="openstack/ovn-controller-6zzxt-config-hkb9k" Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.288289 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-additional-scripts\") pod \"ovn-controller-6zzxt-config-hkb9k\" (UID: \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\") " pod="openstack/ovn-controller-6zzxt-config-hkb9k" Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.289838 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-scripts\") pod \"ovn-controller-6zzxt-config-hkb9k\" (UID: \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\") " pod="openstack/ovn-controller-6zzxt-config-hkb9k" Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.306996 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9428\" (UniqueName: \"kubernetes.io/projected/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-kube-api-access-k9428\") pod \"ovn-controller-6zzxt-config-hkb9k\" (UID: \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\") " pod="openstack/ovn-controller-6zzxt-config-hkb9k" Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.433518 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6zzxt-config-hkb9k"
Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.649684 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-g2c4c" event={"ID":"7aed2e80-e13f-49c6-8570-f6384bc1a079","Type":"ContainerStarted","Data":"0a785e08508c261bd7ed591066bda750326d3b253997986524cf3efdd49aacd4"}
Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.651701 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624","Type":"ContainerStarted","Data":"13e8514dc78f7a2390f7da9342d3ccc672307a0377dc0c8ab138ef247df2f8e7"}
Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.654582 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90","Type":"ContainerStarted","Data":"9d2471433996dbecba9e05447defe62056ea851a4aa9534dca6057e96725ce44"}
Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.654790 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.656590 4869 generic.go:334] "Generic (PLEG): container finished" podID="83ca03e8-e3ed-4a29-a58f-feaf6f76292b" containerID="58e39ac6f1f159b42d13f6561c2908dd16b19cd0850d79e5527a819c6755a9b7" exitCode=0
Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.656621 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lg2gb" event={"ID":"83ca03e8-e3ed-4a29-a58f-feaf6f76292b","Type":"ContainerDied","Data":"58e39ac6f1f159b42d13f6561c2908dd16b19cd0850d79e5527a819c6755a9b7"}
Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.670386 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-g2c4c" podStartSLOduration=3.897303539 podStartE2EDuration="16.670366321s" podCreationTimestamp="2026-02-18 06:04:36 +0000 UTC" firstStartedPulling="2026-02-18 06:04:38.779333275 +0000 UTC m=+975.948421507" lastFinishedPulling="2026-02-18 06:04:51.552396057 +0000 UTC m=+988.721484289" observedRunningTime="2026-02-18 06:04:52.66701432 +0000 UTC m=+989.836102552" watchObservedRunningTime="2026-02-18 06:04:52.670366321 +0000 UTC m=+989.839454553"
Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.704466 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=51.995422513 podStartE2EDuration="1m0.70444574s" podCreationTimestamp="2026-02-18 06:03:52 +0000 UTC" firstStartedPulling="2026-02-18 06:04:06.003080446 +0000 UTC m=+943.172168678" lastFinishedPulling="2026-02-18 06:04:14.712103673 +0000 UTC m=+951.881191905" observedRunningTime="2026-02-18 06:04:52.695883121 +0000 UTC m=+989.864971353" watchObservedRunningTime="2026-02-18 06:04:52.70444574 +0000 UTC m=+989.873533972"
Feb 18 06:04:52 crc kubenswrapper[4869]: I0218 06:04:52.907206 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6zzxt-config-hkb9k"]
Feb 18 06:04:53 crc kubenswrapper[4869]: I0218 06:04:53.665373 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6zzxt-config-hkb9k" event={"ID":"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5","Type":"ContainerStarted","Data":"12560d9c5c3f2fd14f7c7447a545b0f0208f89787c6af42371191217bd5460cf"}
Feb 18 06:04:53 crc kubenswrapper[4869]: I0218 06:04:53.666327 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6zzxt-config-hkb9k" event={"ID":"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5","Type":"ContainerStarted","Data":"632909331a8877fd84c545c2920d36da0c47cf2c7990ae02c38ee753be2bedce"}
Feb 18 06:04:54 crc kubenswrapper[4869]: I0218 06:04:54.318927 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-lg2gb"
Feb 18 06:04:54 crc kubenswrapper[4869]: I0218 06:04:54.350641 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-6zzxt-config-hkb9k" podStartSLOduration=2.350623057 podStartE2EDuration="2.350623057s" podCreationTimestamp="2026-02-18 06:04:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:04:53.697669242 +0000 UTC m=+990.866757474" watchObservedRunningTime="2026-02-18 06:04:54.350623057 +0000 UTC m=+991.519711289"
Feb 18 06:04:54 crc kubenswrapper[4869]: I0218 06:04:54.515239 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83ca03e8-e3ed-4a29-a58f-feaf6f76292b-operator-scripts\") pod \"83ca03e8-e3ed-4a29-a58f-feaf6f76292b\" (UID: \"83ca03e8-e3ed-4a29-a58f-feaf6f76292b\") "
Feb 18 06:04:54 crc kubenswrapper[4869]: I0218 06:04:54.515732 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zzsk\" (UniqueName: \"kubernetes.io/projected/83ca03e8-e3ed-4a29-a58f-feaf6f76292b-kube-api-access-5zzsk\") pod \"83ca03e8-e3ed-4a29-a58f-feaf6f76292b\" (UID: \"83ca03e8-e3ed-4a29-a58f-feaf6f76292b\") "
Feb 18 06:04:54 crc kubenswrapper[4869]: I0218 06:04:54.516041 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83ca03e8-e3ed-4a29-a58f-feaf6f76292b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "83ca03e8-e3ed-4a29-a58f-feaf6f76292b" (UID: "83ca03e8-e3ed-4a29-a58f-feaf6f76292b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:04:54 crc kubenswrapper[4869]: I0218 06:04:54.520211 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83ca03e8-e3ed-4a29-a58f-feaf6f76292b-kube-api-access-5zzsk" (OuterVolumeSpecName: "kube-api-access-5zzsk") pod "83ca03e8-e3ed-4a29-a58f-feaf6f76292b" (UID: "83ca03e8-e3ed-4a29-a58f-feaf6f76292b"). InnerVolumeSpecName "kube-api-access-5zzsk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:04:54 crc kubenswrapper[4869]: I0218 06:04:54.617887 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83ca03e8-e3ed-4a29-a58f-feaf6f76292b-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 06:04:54 crc kubenswrapper[4869]: I0218 06:04:54.617933 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zzsk\" (UniqueName: \"kubernetes.io/projected/83ca03e8-e3ed-4a29-a58f-feaf6f76292b-kube-api-access-5zzsk\") on node \"crc\" DevicePath \"\""
Feb 18 06:04:54 crc kubenswrapper[4869]: I0218 06:04:54.682495 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624","Type":"ContainerStarted","Data":"dfa9e7d0f85e25c28bfbd5428a25bf48c49008113af2680b061ecf1953bdfc6c"}
Feb 18 06:04:54 crc kubenswrapper[4869]: I0218 06:04:54.686096 4869 generic.go:334] "Generic (PLEG): container finished" podID="15f90eb3-a8d8-489d-b8f6-41046e14e165" containerID="00eeaaab546dfd1b221345a3f48c9875681142cf110a173f4094afe2a48c845c" exitCode=0
Feb 18 06:04:54 crc kubenswrapper[4869]: I0218 06:04:54.686215 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"15f90eb3-a8d8-489d-b8f6-41046e14e165","Type":"ContainerDied","Data":"00eeaaab546dfd1b221345a3f48c9875681142cf110a173f4094afe2a48c845c"}
Feb 18 06:04:54 crc kubenswrapper[4869]: I0218 06:04:54.690192 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lg2gb" event={"ID":"83ca03e8-e3ed-4a29-a58f-feaf6f76292b","Type":"ContainerDied","Data":"45b3cf6ef8f54d5f0719bad3e36ddcdb89085e6d2f09ff43f1839d64ff23a030"}
Feb 18 06:04:54 crc kubenswrapper[4869]: I0218 06:04:54.690247 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45b3cf6ef8f54d5f0719bad3e36ddcdb89085e6d2f09ff43f1839d64ff23a030"
Feb 18 06:04:54 crc kubenswrapper[4869]: I0218 06:04:54.690318 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-lg2gb"
Feb 18 06:04:54 crc kubenswrapper[4869]: I0218 06:04:54.692473 4869 generic.go:334] "Generic (PLEG): container finished" podID="36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5" containerID="12560d9c5c3f2fd14f7c7447a545b0f0208f89787c6af42371191217bd5460cf" exitCode=0
Feb 18 06:04:54 crc kubenswrapper[4869]: I0218 06:04:54.692513 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6zzxt-config-hkb9k" event={"ID":"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5","Type":"ContainerDied","Data":"12560d9c5c3f2fd14f7c7447a545b0f0208f89787c6af42371191217bd5460cf"}
Feb 18 06:04:55 crc kubenswrapper[4869]: I0218 06:04:55.702797 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624","Type":"ContainerStarted","Data":"49c9e52c0a1e713f07f22cdc5ff87b89bca5886bd28672b65fb31cbbedc7f0e0"}
Feb 18 06:04:55 crc kubenswrapper[4869]: I0218 06:04:55.703063 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624","Type":"ContainerStarted","Data":"28c5ec176a31643d31e7949f110d59ce6be25f816a6b798caf96936d7b6894e0"}
Feb 18 06:04:55 crc kubenswrapper[4869]: I0218 06:04:55.703072 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624","Type":"ContainerStarted","Data":"5ee76933c8b65a19c08c7891f18c2b99f7d749cc82c2abd5e9ad4500bc667a3a"}
Feb 18 06:04:55 crc kubenswrapper[4869]: I0218 06:04:55.704843 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"15f90eb3-a8d8-489d-b8f6-41046e14e165","Type":"ContainerStarted","Data":"bd73329caf191e3e88645445f11a08e2209ad944a0492b42fb994fe2228de34e"}
Feb 18 06:04:55 crc kubenswrapper[4869]: I0218 06:04:55.705767 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 18 06:04:55 crc kubenswrapper[4869]: I0218 06:04:55.732536 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=56.131796687 podStartE2EDuration="1m4.732507463s" podCreationTimestamp="2026-02-18 06:03:51 +0000 UTC" firstStartedPulling="2026-02-18 06:04:06.031873655 +0000 UTC m=+943.200961887" lastFinishedPulling="2026-02-18 06:04:14.632584411 +0000 UTC m=+951.801672663" observedRunningTime="2026-02-18 06:04:55.726033956 +0000 UTC m=+992.895122188" watchObservedRunningTime="2026-02-18 06:04:55.732507463 +0000 UTC m=+992.901595695"
Feb 18 06:04:56 crc kubenswrapper[4869]: I0218 06:04:56.082147 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6zzxt-config-hkb9k"
Feb 18 06:04:56 crc kubenswrapper[4869]: I0218 06:04:56.260344 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-var-run-ovn\") pod \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\" (UID: \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\") "
Feb 18 06:04:56 crc kubenswrapper[4869]: I0218 06:04:56.260412 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-additional-scripts\") pod \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\" (UID: \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\") "
Feb 18 06:04:56 crc kubenswrapper[4869]: I0218 06:04:56.260468 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5" (UID: "36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 06:04:56 crc kubenswrapper[4869]: I0218 06:04:56.260513 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-scripts\") pod \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\" (UID: \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\") "
Feb 18 06:04:56 crc kubenswrapper[4869]: I0218 06:04:56.260546 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-var-run\") pod \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\" (UID: \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\") "
Feb 18 06:04:56 crc kubenswrapper[4869]: I0218 06:04:56.260561 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-var-log-ovn\") pod \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\" (UID: \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\") "
Feb 18 06:04:56 crc kubenswrapper[4869]: I0218 06:04:56.260599 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9428\" (UniqueName: \"kubernetes.io/projected/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-kube-api-access-k9428\") pod \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\" (UID: \"36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5\") "
Feb 18 06:04:56 crc kubenswrapper[4869]: I0218 06:04:56.260600 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-var-run" (OuterVolumeSpecName: "var-run") pod "36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5" (UID: "36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 06:04:56 crc kubenswrapper[4869]: I0218 06:04:56.260652 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5" (UID: "36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 06:04:56 crc kubenswrapper[4869]: I0218 06:04:56.260914 4869 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-var-run\") on node \"crc\" DevicePath \"\""
Feb 18 06:04:56 crc kubenswrapper[4869]: I0218 06:04:56.260933 4869 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-var-log-ovn\") on node \"crc\" DevicePath \"\""
Feb 18 06:04:56 crc kubenswrapper[4869]: I0218 06:04:56.260942 4869 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-var-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 18 06:04:56 crc kubenswrapper[4869]: I0218 06:04:56.261545 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5" (UID: "36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:04:56 crc kubenswrapper[4869]: I0218 06:04:56.261707 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-scripts" (OuterVolumeSpecName: "scripts") pod "36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5" (UID: "36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:04:56 crc kubenswrapper[4869]: I0218 06:04:56.281111 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-kube-api-access-k9428" (OuterVolumeSpecName: "kube-api-access-k9428") pod "36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5" (UID: "36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5"). InnerVolumeSpecName "kube-api-access-k9428". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:04:56 crc kubenswrapper[4869]: I0218 06:04:56.362292 4869 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-additional-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 06:04:56 crc kubenswrapper[4869]: I0218 06:04:56.362327 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 06:04:56 crc kubenswrapper[4869]: I0218 06:04:56.362338 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9428\" (UniqueName: \"kubernetes.io/projected/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5-kube-api-access-k9428\") on node \"crc\" DevicePath \"\""
Feb 18 06:04:56 crc kubenswrapper[4869]: I0218 06:04:56.668912 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-6zzxt-config-hkb9k"]
Feb 18 06:04:56 crc kubenswrapper[4869]: I0218 06:04:56.679884 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-6zzxt-config-hkb9k"]
Feb 18 06:04:56 crc kubenswrapper[4869]: I0218 06:04:56.689402 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-6zzxt"
Feb 18 06:04:56 crc kubenswrapper[4869]: I0218 06:04:56.745066 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624","Type":"ContainerStarted","Data":"19d661b2ddd8f78ee28c68c447e6a5ba0f9aab3191b7a23f277fc4441fb91160"}
Feb 18 06:04:56 crc kubenswrapper[4869]: I0218 06:04:56.746764 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6zzxt-config-hkb9k"
Feb 18 06:04:56 crc kubenswrapper[4869]: I0218 06:04:56.752184 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="632909331a8877fd84c545c2920d36da0c47cf2c7990ae02c38ee753be2bedce"
Feb 18 06:04:57 crc kubenswrapper[4869]: I0218 06:04:57.480933 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5" path="/var/lib/kubelet/pods/36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5/volumes"
Feb 18 06:04:57 crc kubenswrapper[4869]: I0218 06:04:57.757766 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624","Type":"ContainerStarted","Data":"118976aa861a34c17ddbefc1d9e3163d8150cec91f38f775899a67562f62508d"}
Feb 18 06:04:57 crc kubenswrapper[4869]: I0218 06:04:57.757819 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624","Type":"ContainerStarted","Data":"fc55641ff36221f5d3f71882c1fbc054ee4adae94059920c1de7945142ca658a"}
Feb 18 06:04:57 crc kubenswrapper[4869]: I0218 06:04:57.757833 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624","Type":"ContainerStarted","Data":"bb4388b5c99bf12cd6413d2daaaa5b6ddc811fa594e5f9de7d3c8562ac9fa7e7"}
Feb 18 06:04:59 crc kubenswrapper[4869]: I0218 06:04:59.800221 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-lg2gb"]
Feb 18 06:04:59 crc kubenswrapper[4869]: I0218 06:04:59.806601 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-lg2gb"]
Feb 18 06:05:01 crc kubenswrapper[4869]: I0218 06:05:01.488491 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83ca03e8-e3ed-4a29-a58f-feaf6f76292b" path="/var/lib/kubelet/pods/83ca03e8-e3ed-4a29-a58f-feaf6f76292b/volumes"
Feb 18 06:05:01 crc kubenswrapper[4869]: I0218 06:05:01.798175 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624","Type":"ContainerStarted","Data":"66d4113070ad34f6fc3ecc5bcf2b76fd12ddc7e5066183b384ee4192f6a8c935"}
Feb 18 06:05:02 crc kubenswrapper[4869]: I0218 06:05:02.809577 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624","Type":"ContainerStarted","Data":"b58b61aa87712f17f74145be885f890c6bf970caff20912c57db3b24f2071bc9"}
Feb 18 06:05:02 crc kubenswrapper[4869]: I0218 06:05:02.810124 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624","Type":"ContainerStarted","Data":"e979feea061aec9e6030f73abb5e49ba17ac53ac66a00c3f7fe6f2401376c6ad"}
Feb 18 06:05:02 crc kubenswrapper[4869]: I0218 06:05:02.810136 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624","Type":"ContainerStarted","Data":"76faffd94159dff37e30f0eaca0573905aa6964e4b1b71e8d56667097990e3e2"}
Feb 18 06:05:02 crc kubenswrapper[4869]: I0218 06:05:02.810145 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624","Type":"ContainerStarted","Data":"84027822c8f8fd84df4ac6fd6a4c444bd28aedde88211fd5e28c5c3a49d47e5e"}
Feb 18 06:05:03 crc kubenswrapper[4869]: I0218 06:05:03.653044 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Feb 18 06:05:03 crc kubenswrapper[4869]: I0218 06:05:03.829044 4869 generic.go:334] "Generic (PLEG): container finished" podID="7aed2e80-e13f-49c6-8570-f6384bc1a079" containerID="0a785e08508c261bd7ed591066bda750326d3b253997986524cf3efdd49aacd4" exitCode=0
Feb 18 06:05:03 crc kubenswrapper[4869]: I0218 06:05:03.829101 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-g2c4c" event={"ID":"7aed2e80-e13f-49c6-8570-f6384bc1a079","Type":"ContainerDied","Data":"0a785e08508c261bd7ed591066bda750326d3b253997986524cf3efdd49aacd4"}
Feb 18 06:05:03 crc kubenswrapper[4869]: I0218 06:05:03.848630 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624","Type":"ContainerStarted","Data":"ea6b09ceede04f24b87c6317c8252b9d533b0d108eeda20d8cc3f7a63b521b5b"}
Feb 18 06:05:03 crc kubenswrapper[4869]: I0218 06:05:03.848695 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624","Type":"ContainerStarted","Data":"c5e279da93d3d783271fd141cfdb8b16b95688c616b1e1970f3d22123808f836"}
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.198365 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=26.874963497 podStartE2EDuration="36.198342722s" podCreationTimestamp="2026-02-18 06:04:28 +0000 UTC" firstStartedPulling="2026-02-18 06:04:52.208621771 +0000 UTC m=+989.377710003" lastFinishedPulling="2026-02-18 06:05:01.532000966 +0000 UTC m=+998.701089228" observedRunningTime="2026-02-18 06:05:03.896202281 +0000 UTC m=+1001.065290513" watchObservedRunningTime="2026-02-18 06:05:04.198342722 +0000 UTC m=+1001.367430974"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.203654 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-vhlbk"]
Feb 18 06:05:04 crc kubenswrapper[4869]: E0218 06:05:04.203986 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83ca03e8-e3ed-4a29-a58f-feaf6f76292b" containerName="mariadb-account-create-update"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.204004 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="83ca03e8-e3ed-4a29-a58f-feaf6f76292b" containerName="mariadb-account-create-update"
Feb 18 06:05:04 crc kubenswrapper[4869]: E0218 06:05:04.204013 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5" containerName="ovn-config"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.204019 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5" containerName="ovn-config"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.204220 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="83ca03e8-e3ed-4a29-a58f-feaf6f76292b" containerName="mariadb-account-create-update"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.204240 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="36d2d6d6-3440-4353-91c5-4a5ab3c1b3f5" containerName="ovn-config"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.205066 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-vhlbk"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.207358 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.228413 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-vhlbk"]
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.301405 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-vhlbk\" (UID: \"841d160a-6d76-4977-9d93-ae2bc32e35a0\") " pod="openstack/dnsmasq-dns-764c5664d7-vhlbk"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.301454 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-vhlbk\" (UID: \"841d160a-6d76-4977-9d93-ae2bc32e35a0\") " pod="openstack/dnsmasq-dns-764c5664d7-vhlbk"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.301483 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-vhlbk\" (UID: \"841d160a-6d76-4977-9d93-ae2bc32e35a0\") " pod="openstack/dnsmasq-dns-764c5664d7-vhlbk"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.301508 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvbh5\" (UniqueName: \"kubernetes.io/projected/841d160a-6d76-4977-9d93-ae2bc32e35a0-kube-api-access-wvbh5\") pod \"dnsmasq-dns-764c5664d7-vhlbk\" (UID: \"841d160a-6d76-4977-9d93-ae2bc32e35a0\") " pod="openstack/dnsmasq-dns-764c5664d7-vhlbk"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.301644 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-config\") pod \"dnsmasq-dns-764c5664d7-vhlbk\" (UID: \"841d160a-6d76-4977-9d93-ae2bc32e35a0\") " pod="openstack/dnsmasq-dns-764c5664d7-vhlbk"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.301724 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-dns-svc\") pod \"dnsmasq-dns-764c5664d7-vhlbk\" (UID: \"841d160a-6d76-4977-9d93-ae2bc32e35a0\") " pod="openstack/dnsmasq-dns-764c5664d7-vhlbk"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.403497 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-dns-svc\") pod \"dnsmasq-dns-764c5664d7-vhlbk\" (UID: \"841d160a-6d76-4977-9d93-ae2bc32e35a0\") " pod="openstack/dnsmasq-dns-764c5664d7-vhlbk"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.403581 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-vhlbk\" (UID: \"841d160a-6d76-4977-9d93-ae2bc32e35a0\") " pod="openstack/dnsmasq-dns-764c5664d7-vhlbk"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.403608 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-vhlbk\" (UID: \"841d160a-6d76-4977-9d93-ae2bc32e35a0\") " pod="openstack/dnsmasq-dns-764c5664d7-vhlbk"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.403634 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-vhlbk\" (UID: \"841d160a-6d76-4977-9d93-ae2bc32e35a0\") " pod="openstack/dnsmasq-dns-764c5664d7-vhlbk"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.403658 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvbh5\" (UniqueName: \"kubernetes.io/projected/841d160a-6d76-4977-9d93-ae2bc32e35a0-kube-api-access-wvbh5\") pod \"dnsmasq-dns-764c5664d7-vhlbk\" (UID: \"841d160a-6d76-4977-9d93-ae2bc32e35a0\") " pod="openstack/dnsmasq-dns-764c5664d7-vhlbk"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.403737 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-config\") pod \"dnsmasq-dns-764c5664d7-vhlbk\" (UID: \"841d160a-6d76-4977-9d93-ae2bc32e35a0\") " pod="openstack/dnsmasq-dns-764c5664d7-vhlbk"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.404672 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-dns-svc\") pod \"dnsmasq-dns-764c5664d7-vhlbk\" (UID: \"841d160a-6d76-4977-9d93-ae2bc32e35a0\") " pod="openstack/dnsmasq-dns-764c5664d7-vhlbk"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.404771 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-vhlbk\" (UID: \"841d160a-6d76-4977-9d93-ae2bc32e35a0\") " pod="openstack/dnsmasq-dns-764c5664d7-vhlbk"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.404803 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-vhlbk\" (UID: \"841d160a-6d76-4977-9d93-ae2bc32e35a0\") " pod="openstack/dnsmasq-dns-764c5664d7-vhlbk"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.404817 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-config\") pod \"dnsmasq-dns-764c5664d7-vhlbk\" (UID: \"841d160a-6d76-4977-9d93-ae2bc32e35a0\") " pod="openstack/dnsmasq-dns-764c5664d7-vhlbk"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.404887 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-vhlbk\" (UID: \"841d160a-6d76-4977-9d93-ae2bc32e35a0\") " pod="openstack/dnsmasq-dns-764c5664d7-vhlbk"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.429308 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvbh5\" (UniqueName: \"kubernetes.io/projected/841d160a-6d76-4977-9d93-ae2bc32e35a0-kube-api-access-wvbh5\") pod \"dnsmasq-dns-764c5664d7-vhlbk\" (UID: \"841d160a-6d76-4977-9d93-ae2bc32e35a0\") " pod="openstack/dnsmasq-dns-764c5664d7-vhlbk"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.519792 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-vhlbk"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.813164 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-vhlbk"]
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.826046 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-n69mt"]
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.827325 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-n69mt"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.830753 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.839586 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-n69mt"]
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.883821 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-vhlbk" event={"ID":"841d160a-6d76-4977-9d93-ae2bc32e35a0","Type":"ContainerStarted","Data":"fcc4849bc471c0d2d9338bbd3fdc01e2fdf631f63baba7b282755b9156b68e4d"}
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.914802 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dxj2\" (UniqueName: \"kubernetes.io/projected/8dfb4ffc-626d-4998-8c3e-422f3f41ac00-kube-api-access-9dxj2\") pod \"root-account-create-update-n69mt\" (UID: \"8dfb4ffc-626d-4998-8c3e-422f3f41ac00\") " pod="openstack/root-account-create-update-n69mt"
Feb 18 06:05:04 crc kubenswrapper[4869]: I0218 06:05:04.914858 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dfb4ffc-626d-4998-8c3e-422f3f41ac00-operator-scripts\") pod \"root-account-create-update-n69mt\" (UID: \"8dfb4ffc-626d-4998-8c3e-422f3f41ac00\") " pod="openstack/root-account-create-update-n69mt"
Feb 18 06:05:05 crc kubenswrapper[4869]: I0218 06:05:05.016910 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dxj2\" (UniqueName: \"kubernetes.io/projected/8dfb4ffc-626d-4998-8c3e-422f3f41ac00-kube-api-access-9dxj2\") pod \"root-account-create-update-n69mt\" (UID: \"8dfb4ffc-626d-4998-8c3e-422f3f41ac00\") " pod="openstack/root-account-create-update-n69mt"
Feb 18 06:05:05 crc kubenswrapper[4869]: I0218 06:05:05.016961 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dfb4ffc-626d-4998-8c3e-422f3f41ac00-operator-scripts\") pod \"root-account-create-update-n69mt\" (UID: \"8dfb4ffc-626d-4998-8c3e-422f3f41ac00\") " pod="openstack/root-account-create-update-n69mt"
Feb 18 06:05:05 crc kubenswrapper[4869]: I0218 06:05:05.017683 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dfb4ffc-626d-4998-8c3e-422f3f41ac00-operator-scripts\") pod \"root-account-create-update-n69mt\" (UID: \"8dfb4ffc-626d-4998-8c3e-422f3f41ac00\") " pod="openstack/root-account-create-update-n69mt"
Feb 18 06:05:05 crc kubenswrapper[4869]: I0218 06:05:05.038822 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dxj2\" (UniqueName: \"kubernetes.io/projected/8dfb4ffc-626d-4998-8c3e-422f3f41ac00-kube-api-access-9dxj2\") pod \"root-account-create-update-n69mt\" (UID: \"8dfb4ffc-626d-4998-8c3e-422f3f41ac00\") " pod="openstack/root-account-create-update-n69mt"
Feb 18 06:05:05 crc kubenswrapper[4869]: I0218 06:05:05.201254 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-n69mt"
Feb 18 06:05:05 crc kubenswrapper[4869]: I0218 06:05:05.212602 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-g2c4c"
Feb 18 06:05:05 crc kubenswrapper[4869]: I0218 06:05:05.323663 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2t5f\" (UniqueName: \"kubernetes.io/projected/7aed2e80-e13f-49c6-8570-f6384bc1a079-kube-api-access-m2t5f\") pod \"7aed2e80-e13f-49c6-8570-f6384bc1a079\" (UID: \"7aed2e80-e13f-49c6-8570-f6384bc1a079\") "
Feb 18 06:05:05 crc kubenswrapper[4869]: I0218 06:05:05.324243 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7aed2e80-e13f-49c6-8570-f6384bc1a079-db-sync-config-data\") pod \"7aed2e80-e13f-49c6-8570-f6384bc1a079\" (UID: \"7aed2e80-e13f-49c6-8570-f6384bc1a079\") "
Feb 18 06:05:05 crc kubenswrapper[4869]: I0218 06:05:05.324301 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aed2e80-e13f-49c6-8570-f6384bc1a079-config-data\") pod \"7aed2e80-e13f-49c6-8570-f6384bc1a079\" (UID: \"7aed2e80-e13f-49c6-8570-f6384bc1a079\") "
Feb 18 06:05:05 crc kubenswrapper[4869]: I0218 06:05:05.324538 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aed2e80-e13f-49c6-8570-f6384bc1a079-combined-ca-bundle\") pod \"7aed2e80-e13f-49c6-8570-f6384bc1a079\" (UID: \"7aed2e80-e13f-49c6-8570-f6384bc1a079\") "
Feb 18 06:05:05 crc kubenswrapper[4869]: I0218 06:05:05.333072 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aed2e80-e13f-49c6-8570-f6384bc1a079-kube-api-access-m2t5f" (OuterVolumeSpecName: "kube-api-access-m2t5f") pod "7aed2e80-e13f-49c6-8570-f6384bc1a079" (UID: "7aed2e80-e13f-49c6-8570-f6384bc1a079"). InnerVolumeSpecName "kube-api-access-m2t5f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:05:05 crc kubenswrapper[4869]: I0218 06:05:05.335550 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aed2e80-e13f-49c6-8570-f6384bc1a079-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7aed2e80-e13f-49c6-8570-f6384bc1a079" (UID: "7aed2e80-e13f-49c6-8570-f6384bc1a079"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:05:05 crc kubenswrapper[4869]: I0218 06:05:05.356480 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aed2e80-e13f-49c6-8570-f6384bc1a079-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7aed2e80-e13f-49c6-8570-f6384bc1a079" (UID: "7aed2e80-e13f-49c6-8570-f6384bc1a079"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:05:05 crc kubenswrapper[4869]: I0218 06:05:05.375879 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aed2e80-e13f-49c6-8570-f6384bc1a079-config-data" (OuterVolumeSpecName: "config-data") pod "7aed2e80-e13f-49c6-8570-f6384bc1a079" (UID: "7aed2e80-e13f-49c6-8570-f6384bc1a079"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:05:05 crc kubenswrapper[4869]: I0218 06:05:05.427848 4869 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7aed2e80-e13f-49c6-8570-f6384bc1a079-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:05 crc kubenswrapper[4869]: I0218 06:05:05.427879 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aed2e80-e13f-49c6-8570-f6384bc1a079-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:05 crc kubenswrapper[4869]: I0218 06:05:05.427889 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aed2e80-e13f-49c6-8570-f6384bc1a079-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:05 crc kubenswrapper[4869]: I0218 06:05:05.427897 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2t5f\" (UniqueName: \"kubernetes.io/projected/7aed2e80-e13f-49c6-8570-f6384bc1a079-kube-api-access-m2t5f\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:05 crc kubenswrapper[4869]: I0218 06:05:05.696489 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-n69mt"] Feb 18 06:05:05 crc kubenswrapper[4869]: I0218 06:05:05.923794 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-n69mt" event={"ID":"8dfb4ffc-626d-4998-8c3e-422f3f41ac00","Type":"ContainerStarted","Data":"2f697f4e5ca277b966d46a312c66ae3d1a691cf1bb98ed4d9e4692c1071426d7"} Feb 18 06:05:05 crc kubenswrapper[4869]: I0218 06:05:05.923853 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-n69mt" event={"ID":"8dfb4ffc-626d-4998-8c3e-422f3f41ac00","Type":"ContainerStarted","Data":"b186ba795edb52b2faa8d023905eff73b582aaae9b3842d17df596b61276c2af"} Feb 18 06:05:05 crc kubenswrapper[4869]: I0218 
06:05:05.929877 4869 generic.go:334] "Generic (PLEG): container finished" podID="841d160a-6d76-4977-9d93-ae2bc32e35a0" containerID="cef2dbaee561675650073857182769e47405bea0bc703ac17deac384537f54a0" exitCode=0 Feb 18 06:05:05 crc kubenswrapper[4869]: I0218 06:05:05.930684 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-vhlbk" event={"ID":"841d160a-6d76-4977-9d93-ae2bc32e35a0","Type":"ContainerDied","Data":"cef2dbaee561675650073857182769e47405bea0bc703ac17deac384537f54a0"} Feb 18 06:05:05 crc kubenswrapper[4869]: I0218 06:05:05.944514 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-g2c4c" event={"ID":"7aed2e80-e13f-49c6-8570-f6384bc1a079","Type":"ContainerDied","Data":"f33a5e846af9c9ce9c066a495fbccf68bb6e58c15988dd2271873f83bf1b3d13"} Feb 18 06:05:05 crc kubenswrapper[4869]: I0218 06:05:05.944712 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-g2c4c" Feb 18 06:05:05 crc kubenswrapper[4869]: I0218 06:05:05.944567 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f33a5e846af9c9ce9c066a495fbccf68bb6e58c15988dd2271873f83bf1b3d13" Feb 18 06:05:05 crc kubenswrapper[4869]: I0218 06:05:05.950285 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-n69mt" podStartSLOduration=1.9502679889999999 podStartE2EDuration="1.950267989s" podCreationTimestamp="2026-02-18 06:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:05.944120449 +0000 UTC m=+1003.113208691" watchObservedRunningTime="2026-02-18 06:05:05.950267989 +0000 UTC m=+1003.119356221" Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 06:05:06.392929 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-vhlbk"] Feb 18 06:05:06 crc 
kubenswrapper[4869]: I0218 06:05:06.433988 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g6q8b"] Feb 18 06:05:06 crc kubenswrapper[4869]: E0218 06:05:06.434342 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aed2e80-e13f-49c6-8570-f6384bc1a079" containerName="glance-db-sync" Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 06:05:06.434360 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aed2e80-e13f-49c6-8570-f6384bc1a079" containerName="glance-db-sync" Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 06:05:06.434509 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aed2e80-e13f-49c6-8570-f6384bc1a079" containerName="glance-db-sync" Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 06:05:06.441103 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 06:05:06.476108 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g6q8b"] Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 06:05:06.554579 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-g6q8b\" (UID: \"73514d80-ca11-468b-a6a0-4c3fffff8fea\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 06:05:06.554894 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-config\") pod \"dnsmasq-dns-74f6bcbc87-g6q8b\" (UID: \"73514d80-ca11-468b-a6a0-4c3fffff8fea\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 06:05:06.555027 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-g6q8b\" (UID: \"73514d80-ca11-468b-a6a0-4c3fffff8fea\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 06:05:06.555113 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-g6q8b\" (UID: \"73514d80-ca11-468b-a6a0-4c3fffff8fea\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 06:05:06.555192 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-g6q8b\" (UID: \"73514d80-ca11-468b-a6a0-4c3fffff8fea\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 06:05:06.555298 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgl24\" (UniqueName: \"kubernetes.io/projected/73514d80-ca11-468b-a6a0-4c3fffff8fea-kube-api-access-zgl24\") pod \"dnsmasq-dns-74f6bcbc87-g6q8b\" (UID: \"73514d80-ca11-468b-a6a0-4c3fffff8fea\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 06:05:06.656388 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-g6q8b\" (UID: \"73514d80-ca11-468b-a6a0-4c3fffff8fea\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 
06:05:06.656443 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-g6q8b\" (UID: \"73514d80-ca11-468b-a6a0-4c3fffff8fea\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 06:05:06.656485 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-g6q8b\" (UID: \"73514d80-ca11-468b-a6a0-4c3fffff8fea\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 06:05:06.656547 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgl24\" (UniqueName: \"kubernetes.io/projected/73514d80-ca11-468b-a6a0-4c3fffff8fea-kube-api-access-zgl24\") pod \"dnsmasq-dns-74f6bcbc87-g6q8b\" (UID: \"73514d80-ca11-468b-a6a0-4c3fffff8fea\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 06:05:06.656576 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-g6q8b\" (UID: \"73514d80-ca11-468b-a6a0-4c3fffff8fea\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 06:05:06.656614 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-config\") pod \"dnsmasq-dns-74f6bcbc87-g6q8b\" (UID: \"73514d80-ca11-468b-a6a0-4c3fffff8fea\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 06:05:06.657943 4869 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-config\") pod \"dnsmasq-dns-74f6bcbc87-g6q8b\" (UID: \"73514d80-ca11-468b-a6a0-4c3fffff8fea\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 06:05:06.658057 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-g6q8b\" (UID: \"73514d80-ca11-468b-a6a0-4c3fffff8fea\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 06:05:06.658474 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-g6q8b\" (UID: \"73514d80-ca11-468b-a6a0-4c3fffff8fea\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 06:05:06.658683 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-g6q8b\" (UID: \"73514d80-ca11-468b-a6a0-4c3fffff8fea\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 06:05:06.659098 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-g6q8b\" (UID: \"73514d80-ca11-468b-a6a0-4c3fffff8fea\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 06:05:06.680294 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgl24\" (UniqueName: 
\"kubernetes.io/projected/73514d80-ca11-468b-a6a0-4c3fffff8fea-kube-api-access-zgl24\") pod \"dnsmasq-dns-74f6bcbc87-g6q8b\" (UID: \"73514d80-ca11-468b-a6a0-4c3fffff8fea\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 06:05:06.805532 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 06:05:06.966932 4869 generic.go:334] "Generic (PLEG): container finished" podID="8dfb4ffc-626d-4998-8c3e-422f3f41ac00" containerID="2f697f4e5ca277b966d46a312c66ae3d1a691cf1bb98ed4d9e4692c1071426d7" exitCode=0 Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 06:05:06.967104 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-n69mt" event={"ID":"8dfb4ffc-626d-4998-8c3e-422f3f41ac00","Type":"ContainerDied","Data":"2f697f4e5ca277b966d46a312c66ae3d1a691cf1bb98ed4d9e4692c1071426d7"} Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 06:05:06.969046 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-vhlbk" event={"ID":"841d160a-6d76-4977-9d93-ae2bc32e35a0","Type":"ContainerStarted","Data":"ce52eed908d69e05ae1a26540a89c91f4da5ec01ddeb1ebea33959a6e9462082"} Feb 18 06:05:06 crc kubenswrapper[4869]: I0218 06:05:06.969704 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-vhlbk" Feb 18 06:05:07 crc kubenswrapper[4869]: I0218 06:05:07.017269 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-vhlbk" podStartSLOduration=3.017252214 podStartE2EDuration="3.017252214s" podCreationTimestamp="2026-02-18 06:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:07.009767833 +0000 UTC m=+1004.178856085" watchObservedRunningTime="2026-02-18 
06:05:07.017252214 +0000 UTC m=+1004.186340446" Feb 18 06:05:07 crc kubenswrapper[4869]: I0218 06:05:07.309907 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g6q8b"] Feb 18 06:05:07 crc kubenswrapper[4869]: I0218 06:05:07.980174 4869 generic.go:334] "Generic (PLEG): container finished" podID="73514d80-ca11-468b-a6a0-4c3fffff8fea" containerID="ce12f91d6c365f13573c7ef014d966a63cf08797d1bd54a54585de51bdc7653c" exitCode=0 Feb 18 06:05:07 crc kubenswrapper[4869]: I0218 06:05:07.980798 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-vhlbk" podUID="841d160a-6d76-4977-9d93-ae2bc32e35a0" containerName="dnsmasq-dns" containerID="cri-o://ce52eed908d69e05ae1a26540a89c91f4da5ec01ddeb1ebea33959a6e9462082" gracePeriod=10 Feb 18 06:05:07 crc kubenswrapper[4869]: I0218 06:05:07.982044 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" event={"ID":"73514d80-ca11-468b-a6a0-4c3fffff8fea","Type":"ContainerDied","Data":"ce12f91d6c365f13573c7ef014d966a63cf08797d1bd54a54585de51bdc7653c"} Feb 18 06:05:07 crc kubenswrapper[4869]: I0218 06:05:07.982116 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" event={"ID":"73514d80-ca11-468b-a6a0-4c3fffff8fea","Type":"ContainerStarted","Data":"5963833b063036660d5c36746557c78563fbf7abe2104a6b0b4be750f89976f9"} Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.332380 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-n69mt" Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.387932 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dfb4ffc-626d-4998-8c3e-422f3f41ac00-operator-scripts\") pod \"8dfb4ffc-626d-4998-8c3e-422f3f41ac00\" (UID: \"8dfb4ffc-626d-4998-8c3e-422f3f41ac00\") " Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.388016 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dxj2\" (UniqueName: \"kubernetes.io/projected/8dfb4ffc-626d-4998-8c3e-422f3f41ac00-kube-api-access-9dxj2\") pod \"8dfb4ffc-626d-4998-8c3e-422f3f41ac00\" (UID: \"8dfb4ffc-626d-4998-8c3e-422f3f41ac00\") " Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.389175 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dfb4ffc-626d-4998-8c3e-422f3f41ac00-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8dfb4ffc-626d-4998-8c3e-422f3f41ac00" (UID: "8dfb4ffc-626d-4998-8c3e-422f3f41ac00"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.393893 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dfb4ffc-626d-4998-8c3e-422f3f41ac00-kube-api-access-9dxj2" (OuterVolumeSpecName: "kube-api-access-9dxj2") pod "8dfb4ffc-626d-4998-8c3e-422f3f41ac00" (UID: "8dfb4ffc-626d-4998-8c3e-422f3f41ac00"). InnerVolumeSpecName "kube-api-access-9dxj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.401268 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-vhlbk" Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.489540 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvbh5\" (UniqueName: \"kubernetes.io/projected/841d160a-6d76-4977-9d93-ae2bc32e35a0-kube-api-access-wvbh5\") pod \"841d160a-6d76-4977-9d93-ae2bc32e35a0\" (UID: \"841d160a-6d76-4977-9d93-ae2bc32e35a0\") " Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.489626 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-config\") pod \"841d160a-6d76-4977-9d93-ae2bc32e35a0\" (UID: \"841d160a-6d76-4977-9d93-ae2bc32e35a0\") " Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.489666 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-dns-svc\") pod \"841d160a-6d76-4977-9d93-ae2bc32e35a0\" (UID: \"841d160a-6d76-4977-9d93-ae2bc32e35a0\") " Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.489729 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-ovsdbserver-nb\") pod \"841d160a-6d76-4977-9d93-ae2bc32e35a0\" (UID: \"841d160a-6d76-4977-9d93-ae2bc32e35a0\") " Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.489898 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-dns-swift-storage-0\") pod \"841d160a-6d76-4977-9d93-ae2bc32e35a0\" (UID: \"841d160a-6d76-4977-9d93-ae2bc32e35a0\") " Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.490003 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-ovsdbserver-sb\") pod \"841d160a-6d76-4977-9d93-ae2bc32e35a0\" (UID: \"841d160a-6d76-4977-9d93-ae2bc32e35a0\") " Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.490474 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dfb4ffc-626d-4998-8c3e-422f3f41ac00-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.490498 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dxj2\" (UniqueName: \"kubernetes.io/projected/8dfb4ffc-626d-4998-8c3e-422f3f41ac00-kube-api-access-9dxj2\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.511609 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/841d160a-6d76-4977-9d93-ae2bc32e35a0-kube-api-access-wvbh5" (OuterVolumeSpecName: "kube-api-access-wvbh5") pod "841d160a-6d76-4977-9d93-ae2bc32e35a0" (UID: "841d160a-6d76-4977-9d93-ae2bc32e35a0"). InnerVolumeSpecName "kube-api-access-wvbh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.545507 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "841d160a-6d76-4977-9d93-ae2bc32e35a0" (UID: "841d160a-6d76-4977-9d93-ae2bc32e35a0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.549821 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-config" (OuterVolumeSpecName: "config") pod "841d160a-6d76-4977-9d93-ae2bc32e35a0" (UID: "841d160a-6d76-4977-9d93-ae2bc32e35a0"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.553780 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "841d160a-6d76-4977-9d93-ae2bc32e35a0" (UID: "841d160a-6d76-4977-9d93-ae2bc32e35a0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.558227 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "841d160a-6d76-4977-9d93-ae2bc32e35a0" (UID: "841d160a-6d76-4977-9d93-ae2bc32e35a0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.564162 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "841d160a-6d76-4977-9d93-ae2bc32e35a0" (UID: "841d160a-6d76-4977-9d93-ae2bc32e35a0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.592031 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.592070 4869 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.592084 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.592093 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvbh5\" (UniqueName: \"kubernetes.io/projected/841d160a-6d76-4977-9d93-ae2bc32e35a0-kube-api-access-wvbh5\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.592105 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.592115 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/841d160a-6d76-4977-9d93-ae2bc32e35a0-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.989761 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-n69mt" event={"ID":"8dfb4ffc-626d-4998-8c3e-422f3f41ac00","Type":"ContainerDied","Data":"b186ba795edb52b2faa8d023905eff73b582aaae9b3842d17df596b61276c2af"} Feb 18 06:05:08 crc 
kubenswrapper[4869]: I0218 06:05:08.989816 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b186ba795edb52b2faa8d023905eff73b582aaae9b3842d17df596b61276c2af" Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.989787 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-n69mt" Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.991301 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" event={"ID":"73514d80-ca11-468b-a6a0-4c3fffff8fea","Type":"ContainerStarted","Data":"bdd6ea05c4e5389753594030a5481a76090d9fb7300755ec472f58e94a940d5a"} Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.991378 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.993328 4869 generic.go:334] "Generic (PLEG): container finished" podID="841d160a-6d76-4977-9d93-ae2bc32e35a0" containerID="ce52eed908d69e05ae1a26540a89c91f4da5ec01ddeb1ebea33959a6e9462082" exitCode=0 Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.993364 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-vhlbk" event={"ID":"841d160a-6d76-4977-9d93-ae2bc32e35a0","Type":"ContainerDied","Data":"ce52eed908d69e05ae1a26540a89c91f4da5ec01ddeb1ebea33959a6e9462082"} Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.993386 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-vhlbk" event={"ID":"841d160a-6d76-4977-9d93-ae2bc32e35a0","Type":"ContainerDied","Data":"fcc4849bc471c0d2d9338bbd3fdc01e2fdf631f63baba7b282755b9156b68e4d"} Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 06:05:08.993403 4869 scope.go:117] "RemoveContainer" containerID="ce52eed908d69e05ae1a26540a89c91f4da5ec01ddeb1ebea33959a6e9462082" Feb 18 06:05:08 crc kubenswrapper[4869]: I0218 
06:05:08.993406 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-vhlbk" Feb 18 06:05:09 crc kubenswrapper[4869]: I0218 06:05:09.012945 4869 scope.go:117] "RemoveContainer" containerID="cef2dbaee561675650073857182769e47405bea0bc703ac17deac384537f54a0" Feb 18 06:05:09 crc kubenswrapper[4869]: I0218 06:05:09.014056 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" podStartSLOduration=3.014040081 podStartE2EDuration="3.014040081s" podCreationTimestamp="2026-02-18 06:05:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:09.009439219 +0000 UTC m=+1006.178527471" watchObservedRunningTime="2026-02-18 06:05:09.014040081 +0000 UTC m=+1006.183128303" Feb 18 06:05:09 crc kubenswrapper[4869]: I0218 06:05:09.036856 4869 scope.go:117] "RemoveContainer" containerID="ce52eed908d69e05ae1a26540a89c91f4da5ec01ddeb1ebea33959a6e9462082" Feb 18 06:05:09 crc kubenswrapper[4869]: I0218 06:05:09.036965 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-vhlbk"] Feb 18 06:05:09 crc kubenswrapper[4869]: E0218 06:05:09.037288 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce52eed908d69e05ae1a26540a89c91f4da5ec01ddeb1ebea33959a6e9462082\": container with ID starting with ce52eed908d69e05ae1a26540a89c91f4da5ec01ddeb1ebea33959a6e9462082 not found: ID does not exist" containerID="ce52eed908d69e05ae1a26540a89c91f4da5ec01ddeb1ebea33959a6e9462082" Feb 18 06:05:09 crc kubenswrapper[4869]: I0218 06:05:09.037312 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce52eed908d69e05ae1a26540a89c91f4da5ec01ddeb1ebea33959a6e9462082"} err="failed to get container status 
\"ce52eed908d69e05ae1a26540a89c91f4da5ec01ddeb1ebea33959a6e9462082\": rpc error: code = NotFound desc = could not find container \"ce52eed908d69e05ae1a26540a89c91f4da5ec01ddeb1ebea33959a6e9462082\": container with ID starting with ce52eed908d69e05ae1a26540a89c91f4da5ec01ddeb1ebea33959a6e9462082 not found: ID does not exist" Feb 18 06:05:09 crc kubenswrapper[4869]: I0218 06:05:09.037330 4869 scope.go:117] "RemoveContainer" containerID="cef2dbaee561675650073857182769e47405bea0bc703ac17deac384537f54a0" Feb 18 06:05:09 crc kubenswrapper[4869]: E0218 06:05:09.037662 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cef2dbaee561675650073857182769e47405bea0bc703ac17deac384537f54a0\": container with ID starting with cef2dbaee561675650073857182769e47405bea0bc703ac17deac384537f54a0 not found: ID does not exist" containerID="cef2dbaee561675650073857182769e47405bea0bc703ac17deac384537f54a0" Feb 18 06:05:09 crc kubenswrapper[4869]: I0218 06:05:09.037680 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cef2dbaee561675650073857182769e47405bea0bc703ac17deac384537f54a0"} err="failed to get container status \"cef2dbaee561675650073857182769e47405bea0bc703ac17deac384537f54a0\": rpc error: code = NotFound desc = could not find container \"cef2dbaee561675650073857182769e47405bea0bc703ac17deac384537f54a0\": container with ID starting with cef2dbaee561675650073857182769e47405bea0bc703ac17deac384537f54a0 not found: ID does not exist" Feb 18 06:05:09 crc kubenswrapper[4869]: I0218 06:05:09.044425 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-vhlbk"] Feb 18 06:05:09 crc kubenswrapper[4869]: I0218 06:05:09.479289 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="841d160a-6d76-4977-9d93-ae2bc32e35a0" path="/var/lib/kubelet/pods/841d160a-6d76-4977-9d93-ae2bc32e35a0/volumes" Feb 18 06:05:13 crc 
kubenswrapper[4869]: I0218 06:05:13.364980 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 18 06:05:13 crc kubenswrapper[4869]: I0218 06:05:13.738309 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-dvmlc"] Feb 18 06:05:13 crc kubenswrapper[4869]: E0218 06:05:13.739006 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="841d160a-6d76-4977-9d93-ae2bc32e35a0" containerName="dnsmasq-dns" Feb 18 06:05:13 crc kubenswrapper[4869]: I0218 06:05:13.739024 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="841d160a-6d76-4977-9d93-ae2bc32e35a0" containerName="dnsmasq-dns" Feb 18 06:05:13 crc kubenswrapper[4869]: E0218 06:05:13.739035 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="841d160a-6d76-4977-9d93-ae2bc32e35a0" containerName="init" Feb 18 06:05:13 crc kubenswrapper[4869]: I0218 06:05:13.739042 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="841d160a-6d76-4977-9d93-ae2bc32e35a0" containerName="init" Feb 18 06:05:13 crc kubenswrapper[4869]: E0218 06:05:13.739054 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dfb4ffc-626d-4998-8c3e-422f3f41ac00" containerName="mariadb-account-create-update" Feb 18 06:05:13 crc kubenswrapper[4869]: I0218 06:05:13.739060 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dfb4ffc-626d-4998-8c3e-422f3f41ac00" containerName="mariadb-account-create-update" Feb 18 06:05:13 crc kubenswrapper[4869]: I0218 06:05:13.739213 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dfb4ffc-626d-4998-8c3e-422f3f41ac00" containerName="mariadb-account-create-update" Feb 18 06:05:13 crc kubenswrapper[4869]: I0218 06:05:13.739238 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="841d160a-6d76-4977-9d93-ae2bc32e35a0" containerName="dnsmasq-dns" Feb 18 06:05:13 crc kubenswrapper[4869]: I0218 06:05:13.739797 4869 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-dvmlc" Feb 18 06:05:13 crc kubenswrapper[4869]: I0218 06:05:13.763699 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-dvmlc"] Feb 18 06:05:13 crc kubenswrapper[4869]: I0218 06:05:13.785631 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05aa0f74-aab3-44e4-805b-4d4df0c86c5b-operator-scripts\") pod \"cinder-db-create-dvmlc\" (UID: \"05aa0f74-aab3-44e4-805b-4d4df0c86c5b\") " pod="openstack/cinder-db-create-dvmlc" Feb 18 06:05:13 crc kubenswrapper[4869]: I0218 06:05:13.786317 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z27x8\" (UniqueName: \"kubernetes.io/projected/05aa0f74-aab3-44e4-805b-4d4df0c86c5b-kube-api-access-z27x8\") pod \"cinder-db-create-dvmlc\" (UID: \"05aa0f74-aab3-44e4-805b-4d4df0c86c5b\") " pod="openstack/cinder-db-create-dvmlc" Feb 18 06:05:13 crc kubenswrapper[4869]: I0218 06:05:13.859515 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-bede-account-create-update-tsvv4"] Feb 18 06:05:13 crc kubenswrapper[4869]: I0218 06:05:13.860547 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-bede-account-create-update-tsvv4" Feb 18 06:05:13 crc kubenswrapper[4869]: I0218 06:05:13.870478 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-bede-account-create-update-tsvv4"] Feb 18 06:05:13 crc kubenswrapper[4869]: I0218 06:05:13.879343 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 18 06:05:13 crc kubenswrapper[4869]: I0218 06:05:13.887648 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z27x8\" (UniqueName: \"kubernetes.io/projected/05aa0f74-aab3-44e4-805b-4d4df0c86c5b-kube-api-access-z27x8\") pod \"cinder-db-create-dvmlc\" (UID: \"05aa0f74-aab3-44e4-805b-4d4df0c86c5b\") " pod="openstack/cinder-db-create-dvmlc" Feb 18 06:05:13 crc kubenswrapper[4869]: I0218 06:05:13.887728 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05aa0f74-aab3-44e4-805b-4d4df0c86c5b-operator-scripts\") pod \"cinder-db-create-dvmlc\" (UID: \"05aa0f74-aab3-44e4-805b-4d4df0c86c5b\") " pod="openstack/cinder-db-create-dvmlc" Feb 18 06:05:13 crc kubenswrapper[4869]: I0218 06:05:13.888398 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05aa0f74-aab3-44e4-805b-4d4df0c86c5b-operator-scripts\") pod \"cinder-db-create-dvmlc\" (UID: \"05aa0f74-aab3-44e4-805b-4d4df0c86c5b\") " pod="openstack/cinder-db-create-dvmlc" Feb 18 06:05:13 crc kubenswrapper[4869]: I0218 06:05:13.942403 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z27x8\" (UniqueName: \"kubernetes.io/projected/05aa0f74-aab3-44e4-805b-4d4df0c86c5b-kube-api-access-z27x8\") pod \"cinder-db-create-dvmlc\" (UID: \"05aa0f74-aab3-44e4-805b-4d4df0c86c5b\") " pod="openstack/cinder-db-create-dvmlc" Feb 18 06:05:13 crc kubenswrapper[4869]: I0218 
06:05:13.950451 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-rskmj"] Feb 18 06:05:13 crc kubenswrapper[4869]: I0218 06:05:13.951537 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-rskmj" Feb 18 06:05:13 crc kubenswrapper[4869]: I0218 06:05:13.972854 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-50c2-account-create-update-fv8ng"] Feb 18 06:05:13 crc kubenswrapper[4869]: I0218 06:05:13.973921 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-50c2-account-create-update-fv8ng" Feb 18 06:05:13 crc kubenswrapper[4869]: I0218 06:05:13.985024 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 18 06:05:13 crc kubenswrapper[4869]: I0218 06:05:13.988843 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sskkp\" (UniqueName: \"kubernetes.io/projected/ddfad6e9-7d65-4a60-9bbd-c6b552167a4e-kube-api-access-sskkp\") pod \"barbican-bede-account-create-update-tsvv4\" (UID: \"ddfad6e9-7d65-4a60-9bbd-c6b552167a4e\") " pod="openstack/barbican-bede-account-create-update-tsvv4" Feb 18 06:05:13 crc kubenswrapper[4869]: I0218 06:05:13.988947 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddfad6e9-7d65-4a60-9bbd-c6b552167a4e-operator-scripts\") pod \"barbican-bede-account-create-update-tsvv4\" (UID: \"ddfad6e9-7d65-4a60-9bbd-c6b552167a4e\") " pod="openstack/barbican-bede-account-create-update-tsvv4" Feb 18 06:05:13 crc kubenswrapper[4869]: I0218 06:05:13.990833 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-rskmj"] Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.000196 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-50c2-account-create-update-fv8ng"] Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.039689 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-5qhrw"] Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.040790 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-5qhrw" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.051192 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-5qhrw"] Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.059016 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-dvmlc" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.091441 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x84p9\" (UniqueName: \"kubernetes.io/projected/c76f6717-a534-466b-8c8b-42bf00e770e8-kube-api-access-x84p9\") pod \"barbican-db-create-rskmj\" (UID: \"c76f6717-a534-466b-8c8b-42bf00e770e8\") " pod="openstack/barbican-db-create-rskmj" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.091487 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4728n\" (UniqueName: \"kubernetes.io/projected/5d7b2c26-9c52-4095-a3a4-9e97e69d8cda-kube-api-access-4728n\") pod \"neutron-db-create-5qhrw\" (UID: \"5d7b2c26-9c52-4095-a3a4-9e97e69d8cda\") " pod="openstack/neutron-db-create-5qhrw" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.091522 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sskkp\" (UniqueName: \"kubernetes.io/projected/ddfad6e9-7d65-4a60-9bbd-c6b552167a4e-kube-api-access-sskkp\") pod \"barbican-bede-account-create-update-tsvv4\" (UID: \"ddfad6e9-7d65-4a60-9bbd-c6b552167a4e\") " pod="openstack/barbican-bede-account-create-update-tsvv4" Feb 18 
06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.091563 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a9c7e54-b859-4ec1-84ea-65b575a5bb54-operator-scripts\") pod \"cinder-50c2-account-create-update-fv8ng\" (UID: \"8a9c7e54-b859-4ec1-84ea-65b575a5bb54\") " pod="openstack/cinder-50c2-account-create-update-fv8ng" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.091597 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c76f6717-a534-466b-8c8b-42bf00e770e8-operator-scripts\") pod \"barbican-db-create-rskmj\" (UID: \"c76f6717-a534-466b-8c8b-42bf00e770e8\") " pod="openstack/barbican-db-create-rskmj" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.091629 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rzsw\" (UniqueName: \"kubernetes.io/projected/8a9c7e54-b859-4ec1-84ea-65b575a5bb54-kube-api-access-9rzsw\") pod \"cinder-50c2-account-create-update-fv8ng\" (UID: \"8a9c7e54-b859-4ec1-84ea-65b575a5bb54\") " pod="openstack/cinder-50c2-account-create-update-fv8ng" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.091652 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d7b2c26-9c52-4095-a3a4-9e97e69d8cda-operator-scripts\") pod \"neutron-db-create-5qhrw\" (UID: \"5d7b2c26-9c52-4095-a3a4-9e97e69d8cda\") " pod="openstack/neutron-db-create-5qhrw" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.091690 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddfad6e9-7d65-4a60-9bbd-c6b552167a4e-operator-scripts\") pod \"barbican-bede-account-create-update-tsvv4\" (UID: 
\"ddfad6e9-7d65-4a60-9bbd-c6b552167a4e\") " pod="openstack/barbican-bede-account-create-update-tsvv4" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.092796 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddfad6e9-7d65-4a60-9bbd-c6b552167a4e-operator-scripts\") pod \"barbican-bede-account-create-update-tsvv4\" (UID: \"ddfad6e9-7d65-4a60-9bbd-c6b552167a4e\") " pod="openstack/barbican-bede-account-create-update-tsvv4" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.118188 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-94jf2"] Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.129335 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sskkp\" (UniqueName: \"kubernetes.io/projected/ddfad6e9-7d65-4a60-9bbd-c6b552167a4e-kube-api-access-sskkp\") pod \"barbican-bede-account-create-update-tsvv4\" (UID: \"ddfad6e9-7d65-4a60-9bbd-c6b552167a4e\") " pod="openstack/barbican-bede-account-create-update-tsvv4" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.137305 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-94jf2" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.142019 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-h7tvj" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.142310 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.142529 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.144243 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-94jf2"] Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.144435 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.181777 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-bede-account-create-update-tsvv4" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.182555 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-e36c-account-create-update-49m75"] Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.183551 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-e36c-account-create-update-49m75" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.185928 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.192447 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e36c-account-create-update-49m75"] Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.193382 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/173c6fc9-4198-471e-b6f3-a7445d402034-config-data\") pod \"keystone-db-sync-94jf2\" (UID: \"173c6fc9-4198-471e-b6f3-a7445d402034\") " pod="openstack/keystone-db-sync-94jf2" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.193416 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4728n\" (UniqueName: \"kubernetes.io/projected/5d7b2c26-9c52-4095-a3a4-9e97e69d8cda-kube-api-access-4728n\") pod \"neutron-db-create-5qhrw\" (UID: \"5d7b2c26-9c52-4095-a3a4-9e97e69d8cda\") " pod="openstack/neutron-db-create-5qhrw" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.193439 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x84p9\" (UniqueName: \"kubernetes.io/projected/c76f6717-a534-466b-8c8b-42bf00e770e8-kube-api-access-x84p9\") pod \"barbican-db-create-rskmj\" (UID: \"c76f6717-a534-466b-8c8b-42bf00e770e8\") " pod="openstack/barbican-db-create-rskmj" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.193487 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a9c7e54-b859-4ec1-84ea-65b575a5bb54-operator-scripts\") pod \"cinder-50c2-account-create-update-fv8ng\" (UID: \"8a9c7e54-b859-4ec1-84ea-65b575a5bb54\") " 
pod="openstack/cinder-50c2-account-create-update-fv8ng" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.193515 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c76f6717-a534-466b-8c8b-42bf00e770e8-operator-scripts\") pod \"barbican-db-create-rskmj\" (UID: \"c76f6717-a534-466b-8c8b-42bf00e770e8\") " pod="openstack/barbican-db-create-rskmj" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.193542 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br8fw\" (UniqueName: \"kubernetes.io/projected/173c6fc9-4198-471e-b6f3-a7445d402034-kube-api-access-br8fw\") pod \"keystone-db-sync-94jf2\" (UID: \"173c6fc9-4198-471e-b6f3-a7445d402034\") " pod="openstack/keystone-db-sync-94jf2" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.193568 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rzsw\" (UniqueName: \"kubernetes.io/projected/8a9c7e54-b859-4ec1-84ea-65b575a5bb54-kube-api-access-9rzsw\") pod \"cinder-50c2-account-create-update-fv8ng\" (UID: \"8a9c7e54-b859-4ec1-84ea-65b575a5bb54\") " pod="openstack/cinder-50c2-account-create-update-fv8ng" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.193591 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d7b2c26-9c52-4095-a3a4-9e97e69d8cda-operator-scripts\") pod \"neutron-db-create-5qhrw\" (UID: \"5d7b2c26-9c52-4095-a3a4-9e97e69d8cda\") " pod="openstack/neutron-db-create-5qhrw" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.193637 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/173c6fc9-4198-471e-b6f3-a7445d402034-combined-ca-bundle\") pod \"keystone-db-sync-94jf2\" (UID: 
\"173c6fc9-4198-471e-b6f3-a7445d402034\") " pod="openstack/keystone-db-sync-94jf2" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.203697 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a9c7e54-b859-4ec1-84ea-65b575a5bb54-operator-scripts\") pod \"cinder-50c2-account-create-update-fv8ng\" (UID: \"8a9c7e54-b859-4ec1-84ea-65b575a5bb54\") " pod="openstack/cinder-50c2-account-create-update-fv8ng" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.204377 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c76f6717-a534-466b-8c8b-42bf00e770e8-operator-scripts\") pod \"barbican-db-create-rskmj\" (UID: \"c76f6717-a534-466b-8c8b-42bf00e770e8\") " pod="openstack/barbican-db-create-rskmj" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.205188 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d7b2c26-9c52-4095-a3a4-9e97e69d8cda-operator-scripts\") pod \"neutron-db-create-5qhrw\" (UID: \"5d7b2c26-9c52-4095-a3a4-9e97e69d8cda\") " pod="openstack/neutron-db-create-5qhrw" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.214609 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4728n\" (UniqueName: \"kubernetes.io/projected/5d7b2c26-9c52-4095-a3a4-9e97e69d8cda-kube-api-access-4728n\") pod \"neutron-db-create-5qhrw\" (UID: \"5d7b2c26-9c52-4095-a3a4-9e97e69d8cda\") " pod="openstack/neutron-db-create-5qhrw" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.224453 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rzsw\" (UniqueName: \"kubernetes.io/projected/8a9c7e54-b859-4ec1-84ea-65b575a5bb54-kube-api-access-9rzsw\") pod \"cinder-50c2-account-create-update-fv8ng\" (UID: \"8a9c7e54-b859-4ec1-84ea-65b575a5bb54\") " 
pod="openstack/cinder-50c2-account-create-update-fv8ng" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.232212 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x84p9\" (UniqueName: \"kubernetes.io/projected/c76f6717-a534-466b-8c8b-42bf00e770e8-kube-api-access-x84p9\") pod \"barbican-db-create-rskmj\" (UID: \"c76f6717-a534-466b-8c8b-42bf00e770e8\") " pod="openstack/barbican-db-create-rskmj" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.278233 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-rskmj" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.288246 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-50c2-account-create-update-fv8ng" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.295616 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/544d6a86-e6d4-47b4-916e-2dfbe467b5f6-operator-scripts\") pod \"neutron-e36c-account-create-update-49m75\" (UID: \"544d6a86-e6d4-47b4-916e-2dfbe467b5f6\") " pod="openstack/neutron-e36c-account-create-update-49m75" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.295679 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br8fw\" (UniqueName: \"kubernetes.io/projected/173c6fc9-4198-471e-b6f3-a7445d402034-kube-api-access-br8fw\") pod \"keystone-db-sync-94jf2\" (UID: \"173c6fc9-4198-471e-b6f3-a7445d402034\") " pod="openstack/keystone-db-sync-94jf2" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.295772 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jc42\" (UniqueName: \"kubernetes.io/projected/544d6a86-e6d4-47b4-916e-2dfbe467b5f6-kube-api-access-5jc42\") pod \"neutron-e36c-account-create-update-49m75\" (UID: 
\"544d6a86-e6d4-47b4-916e-2dfbe467b5f6\") " pod="openstack/neutron-e36c-account-create-update-49m75" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.295805 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/173c6fc9-4198-471e-b6f3-a7445d402034-combined-ca-bundle\") pod \"keystone-db-sync-94jf2\" (UID: \"173c6fc9-4198-471e-b6f3-a7445d402034\") " pod="openstack/keystone-db-sync-94jf2" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.295862 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/173c6fc9-4198-471e-b6f3-a7445d402034-config-data\") pod \"keystone-db-sync-94jf2\" (UID: \"173c6fc9-4198-471e-b6f3-a7445d402034\") " pod="openstack/keystone-db-sync-94jf2" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.302211 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/173c6fc9-4198-471e-b6f3-a7445d402034-combined-ca-bundle\") pod \"keystone-db-sync-94jf2\" (UID: \"173c6fc9-4198-471e-b6f3-a7445d402034\") " pod="openstack/keystone-db-sync-94jf2" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.302734 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/173c6fc9-4198-471e-b6f3-a7445d402034-config-data\") pod \"keystone-db-sync-94jf2\" (UID: \"173c6fc9-4198-471e-b6f3-a7445d402034\") " pod="openstack/keystone-db-sync-94jf2" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.319058 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br8fw\" (UniqueName: \"kubernetes.io/projected/173c6fc9-4198-471e-b6f3-a7445d402034-kube-api-access-br8fw\") pod \"keystone-db-sync-94jf2\" (UID: \"173c6fc9-4198-471e-b6f3-a7445d402034\") " pod="openstack/keystone-db-sync-94jf2" Feb 18 06:05:14 crc 
kubenswrapper[4869]: I0218 06:05:14.354508 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-5qhrw" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.399736 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/544d6a86-e6d4-47b4-916e-2dfbe467b5f6-operator-scripts\") pod \"neutron-e36c-account-create-update-49m75\" (UID: \"544d6a86-e6d4-47b4-916e-2dfbe467b5f6\") " pod="openstack/neutron-e36c-account-create-update-49m75" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.399922 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jc42\" (UniqueName: \"kubernetes.io/projected/544d6a86-e6d4-47b4-916e-2dfbe467b5f6-kube-api-access-5jc42\") pod \"neutron-e36c-account-create-update-49m75\" (UID: \"544d6a86-e6d4-47b4-916e-2dfbe467b5f6\") " pod="openstack/neutron-e36c-account-create-update-49m75" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.400762 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/544d6a86-e6d4-47b4-916e-2dfbe467b5f6-operator-scripts\") pod \"neutron-e36c-account-create-update-49m75\" (UID: \"544d6a86-e6d4-47b4-916e-2dfbe467b5f6\") " pod="openstack/neutron-e36c-account-create-update-49m75" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.444401 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jc42\" (UniqueName: \"kubernetes.io/projected/544d6a86-e6d4-47b4-916e-2dfbe467b5f6-kube-api-access-5jc42\") pod \"neutron-e36c-account-create-update-49m75\" (UID: \"544d6a86-e6d4-47b4-916e-2dfbe467b5f6\") " pod="openstack/neutron-e36c-account-create-update-49m75" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.510086 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-94jf2" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.555169 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e36c-account-create-update-49m75" Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.667915 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-dvmlc"] Feb 18 06:05:14 crc kubenswrapper[4869]: W0218 06:05:14.669715 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05aa0f74_aab3_44e4_805b_4d4df0c86c5b.slice/crio-6bb258db0e59b8682016881f9aaa362c28a69d4b60085669d2a975b910b24b37 WatchSource:0}: Error finding container 6bb258db0e59b8682016881f9aaa362c28a69d4b60085669d2a975b910b24b37: Status 404 returned error can't find the container with id 6bb258db0e59b8682016881f9aaa362c28a69d4b60085669d2a975b910b24b37 Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.745182 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-bede-account-create-update-tsvv4"] Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.928903 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-5qhrw"] Feb 18 06:05:14 crc kubenswrapper[4869]: W0218 06:05:14.936129 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d7b2c26_9c52_4095_a3a4_9e97e69d8cda.slice/crio-2cbf298e06074977a3c30cf7944b20a7e146696e17224fe3081c537a26c7e313 WatchSource:0}: Error finding container 2cbf298e06074977a3c30cf7944b20a7e146696e17224fe3081c537a26c7e313: Status 404 returned error can't find the container with id 2cbf298e06074977a3c30cf7944b20a7e146696e17224fe3081c537a26c7e313 Feb 18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.976931 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-rskmj"] Feb 
18 06:05:14 crc kubenswrapper[4869]: I0218 06:05:14.982851 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-50c2-account-create-update-fv8ng"] Feb 18 06:05:15 crc kubenswrapper[4869]: I0218 06:05:15.056049 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-dvmlc" event={"ID":"05aa0f74-aab3-44e4-805b-4d4df0c86c5b","Type":"ContainerStarted","Data":"a8ead2ac67d66baf8286361697a3fa7efc5ff6491ae73c99b9ec749774cf5062"} Feb 18 06:05:15 crc kubenswrapper[4869]: I0218 06:05:15.056094 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-dvmlc" event={"ID":"05aa0f74-aab3-44e4-805b-4d4df0c86c5b","Type":"ContainerStarted","Data":"6bb258db0e59b8682016881f9aaa362c28a69d4b60085669d2a975b910b24b37"} Feb 18 06:05:15 crc kubenswrapper[4869]: I0218 06:05:15.057636 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-50c2-account-create-update-fv8ng" event={"ID":"8a9c7e54-b859-4ec1-84ea-65b575a5bb54","Type":"ContainerStarted","Data":"5164618db19957dbe3448135bc75072f4996a8d4943f3e2471b21eb93fc30837"} Feb 18 06:05:15 crc kubenswrapper[4869]: I0218 06:05:15.061237 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bede-account-create-update-tsvv4" event={"ID":"ddfad6e9-7d65-4a60-9bbd-c6b552167a4e","Type":"ContainerStarted","Data":"4ea7de77e4efd092d0be0efeb21b1d05d57f4a0c8c797f8026ad3a3a526d4420"} Feb 18 06:05:15 crc kubenswrapper[4869]: I0218 06:05:15.061265 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bede-account-create-update-tsvv4" event={"ID":"ddfad6e9-7d65-4a60-9bbd-c6b552167a4e","Type":"ContainerStarted","Data":"3d69450e49be6ff942838ae90fbdd020828d10cebff2d746adc05544e8499904"} Feb 18 06:05:15 crc kubenswrapper[4869]: I0218 06:05:15.064529 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5qhrw" 
event={"ID":"5d7b2c26-9c52-4095-a3a4-9e97e69d8cda","Type":"ContainerStarted","Data":"2cbf298e06074977a3c30cf7944b20a7e146696e17224fe3081c537a26c7e313"} Feb 18 06:05:15 crc kubenswrapper[4869]: I0218 06:05:15.078344 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rskmj" event={"ID":"c76f6717-a534-466b-8c8b-42bf00e770e8","Type":"ContainerStarted","Data":"7c20ad9ec18f7503e81186d8a2442a676c951c2b95f29bf979546b21021aa126"} Feb 18 06:05:15 crc kubenswrapper[4869]: I0218 06:05:15.092073 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e36c-account-create-update-49m75"] Feb 18 06:05:15 crc kubenswrapper[4869]: I0218 06:05:15.105149 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-bede-account-create-update-tsvv4" podStartSLOduration=2.105126129 podStartE2EDuration="2.105126129s" podCreationTimestamp="2026-02-18 06:05:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:15.100708311 +0000 UTC m=+1012.269796543" watchObservedRunningTime="2026-02-18 06:05:15.105126129 +0000 UTC m=+1012.274214361" Feb 18 06:05:15 crc kubenswrapper[4869]: I0218 06:05:15.117319 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-94jf2"] Feb 18 06:05:16 crc kubenswrapper[4869]: I0218 06:05:16.088884 4869 generic.go:334] "Generic (PLEG): container finished" podID="544d6a86-e6d4-47b4-916e-2dfbe467b5f6" containerID="4a358cdd903eb3bbbe90c423fd331e1493e6ee1d4038d13975c0f8e86d5083c8" exitCode=0 Feb 18 06:05:16 crc kubenswrapper[4869]: I0218 06:05:16.088958 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e36c-account-create-update-49m75" event={"ID":"544d6a86-e6d4-47b4-916e-2dfbe467b5f6","Type":"ContainerDied","Data":"4a358cdd903eb3bbbe90c423fd331e1493e6ee1d4038d13975c0f8e86d5083c8"} Feb 18 06:05:16 crc 
kubenswrapper[4869]: I0218 06:05:16.088991 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e36c-account-create-update-49m75" event={"ID":"544d6a86-e6d4-47b4-916e-2dfbe467b5f6","Type":"ContainerStarted","Data":"93148a6887f500d4249e6570ac4db97824a2a4d48fcfad3b69039ed1602eb087"} Feb 18 06:05:16 crc kubenswrapper[4869]: I0218 06:05:16.093631 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-94jf2" event={"ID":"173c6fc9-4198-471e-b6f3-a7445d402034","Type":"ContainerStarted","Data":"2302b9692daa5680fd350b485a47db0175235f11c999333549871700fe00a634"} Feb 18 06:05:16 crc kubenswrapper[4869]: I0218 06:05:16.097672 4869 generic.go:334] "Generic (PLEG): container finished" podID="8a9c7e54-b859-4ec1-84ea-65b575a5bb54" containerID="6bd55a076bf948119308b606a4b72cd5d4a902ac4fd363c8c44252eb06ca1987" exitCode=0 Feb 18 06:05:16 crc kubenswrapper[4869]: I0218 06:05:16.097853 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-50c2-account-create-update-fv8ng" event={"ID":"8a9c7e54-b859-4ec1-84ea-65b575a5bb54","Type":"ContainerDied","Data":"6bd55a076bf948119308b606a4b72cd5d4a902ac4fd363c8c44252eb06ca1987"} Feb 18 06:05:16 crc kubenswrapper[4869]: I0218 06:05:16.100351 4869 generic.go:334] "Generic (PLEG): container finished" podID="ddfad6e9-7d65-4a60-9bbd-c6b552167a4e" containerID="4ea7de77e4efd092d0be0efeb21b1d05d57f4a0c8c797f8026ad3a3a526d4420" exitCode=0 Feb 18 06:05:16 crc kubenswrapper[4869]: I0218 06:05:16.100390 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bede-account-create-update-tsvv4" event={"ID":"ddfad6e9-7d65-4a60-9bbd-c6b552167a4e","Type":"ContainerDied","Data":"4ea7de77e4efd092d0be0efeb21b1d05d57f4a0c8c797f8026ad3a3a526d4420"} Feb 18 06:05:16 crc kubenswrapper[4869]: I0218 06:05:16.101996 4869 generic.go:334] "Generic (PLEG): container finished" podID="5d7b2c26-9c52-4095-a3a4-9e97e69d8cda" 
containerID="a3781a690a1820eed2cbdaf2377feaf6a61f019fe05b12e09a38e10053f32645" exitCode=0 Feb 18 06:05:16 crc kubenswrapper[4869]: I0218 06:05:16.102172 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5qhrw" event={"ID":"5d7b2c26-9c52-4095-a3a4-9e97e69d8cda","Type":"ContainerDied","Data":"a3781a690a1820eed2cbdaf2377feaf6a61f019fe05b12e09a38e10053f32645"} Feb 18 06:05:16 crc kubenswrapper[4869]: I0218 06:05:16.134810 4869 generic.go:334] "Generic (PLEG): container finished" podID="c76f6717-a534-466b-8c8b-42bf00e770e8" containerID="32055da025d3efbc8424f8602561e6471dc62fb2fd8e676bfbf9903d0a6c8873" exitCode=0 Feb 18 06:05:16 crc kubenswrapper[4869]: I0218 06:05:16.134866 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rskmj" event={"ID":"c76f6717-a534-466b-8c8b-42bf00e770e8","Type":"ContainerDied","Data":"32055da025d3efbc8424f8602561e6471dc62fb2fd8e676bfbf9903d0a6c8873"} Feb 18 06:05:16 crc kubenswrapper[4869]: I0218 06:05:16.136351 4869 generic.go:334] "Generic (PLEG): container finished" podID="05aa0f74-aab3-44e4-805b-4d4df0c86c5b" containerID="a8ead2ac67d66baf8286361697a3fa7efc5ff6491ae73c99b9ec749774cf5062" exitCode=0 Feb 18 06:05:16 crc kubenswrapper[4869]: I0218 06:05:16.136376 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-dvmlc" event={"ID":"05aa0f74-aab3-44e4-805b-4d4df0c86c5b","Type":"ContainerDied","Data":"a8ead2ac67d66baf8286361697a3fa7efc5ff6491ae73c99b9ec749774cf5062"} Feb 18 06:05:16 crc kubenswrapper[4869]: I0218 06:05:16.511564 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-dvmlc" Feb 18 06:05:16 crc kubenswrapper[4869]: I0218 06:05:16.637194 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05aa0f74-aab3-44e4-805b-4d4df0c86c5b-operator-scripts\") pod \"05aa0f74-aab3-44e4-805b-4d4df0c86c5b\" (UID: \"05aa0f74-aab3-44e4-805b-4d4df0c86c5b\") " Feb 18 06:05:16 crc kubenswrapper[4869]: I0218 06:05:16.637530 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z27x8\" (UniqueName: \"kubernetes.io/projected/05aa0f74-aab3-44e4-805b-4d4df0c86c5b-kube-api-access-z27x8\") pod \"05aa0f74-aab3-44e4-805b-4d4df0c86c5b\" (UID: \"05aa0f74-aab3-44e4-805b-4d4df0c86c5b\") " Feb 18 06:05:16 crc kubenswrapper[4869]: I0218 06:05:16.638200 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05aa0f74-aab3-44e4-805b-4d4df0c86c5b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "05aa0f74-aab3-44e4-805b-4d4df0c86c5b" (UID: "05aa0f74-aab3-44e4-805b-4d4df0c86c5b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:16 crc kubenswrapper[4869]: I0218 06:05:16.651091 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05aa0f74-aab3-44e4-805b-4d4df0c86c5b-kube-api-access-z27x8" (OuterVolumeSpecName: "kube-api-access-z27x8") pod "05aa0f74-aab3-44e4-805b-4d4df0c86c5b" (UID: "05aa0f74-aab3-44e4-805b-4d4df0c86c5b"). InnerVolumeSpecName "kube-api-access-z27x8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:16 crc kubenswrapper[4869]: I0218 06:05:16.739729 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z27x8\" (UniqueName: \"kubernetes.io/projected/05aa0f74-aab3-44e4-805b-4d4df0c86c5b-kube-api-access-z27x8\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:16 crc kubenswrapper[4869]: I0218 06:05:16.740052 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05aa0f74-aab3-44e4-805b-4d4df0c86c5b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:16 crc kubenswrapper[4869]: I0218 06:05:16.806963 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" Feb 18 06:05:16 crc kubenswrapper[4869]: I0218 06:05:16.885454 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-rx7zr"] Feb 18 06:05:16 crc kubenswrapper[4869]: I0218 06:05:16.885718 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-rx7zr" podUID="bf2194de-4e6e-4d75-814f-480d69124118" containerName="dnsmasq-dns" containerID="cri-o://fa6bcbdc86a045336486a638bdc9d78200e2ed1d9973d63f4978d97356fa688c" gracePeriod=10 Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.147121 4869 generic.go:334] "Generic (PLEG): container finished" podID="bf2194de-4e6e-4d75-814f-480d69124118" containerID="fa6bcbdc86a045336486a638bdc9d78200e2ed1d9973d63f4978d97356fa688c" exitCode=0 Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.147185 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-rx7zr" event={"ID":"bf2194de-4e6e-4d75-814f-480d69124118","Type":"ContainerDied","Data":"fa6bcbdc86a045336486a638bdc9d78200e2ed1d9973d63f4978d97356fa688c"} Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.149388 4869 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/cinder-db-create-dvmlc" Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.149832 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-dvmlc" event={"ID":"05aa0f74-aab3-44e4-805b-4d4df0c86c5b","Type":"ContainerDied","Data":"6bb258db0e59b8682016881f9aaa362c28a69d4b60085669d2a975b910b24b37"} Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.149963 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bb258db0e59b8682016881f9aaa362c28a69d4b60085669d2a975b910b24b37" Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.349954 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-rx7zr" Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.454594 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf2194de-4e6e-4d75-814f-480d69124118-ovsdbserver-sb\") pod \"bf2194de-4e6e-4d75-814f-480d69124118\" (UID: \"bf2194de-4e6e-4d75-814f-480d69124118\") " Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.454713 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf2194de-4e6e-4d75-814f-480d69124118-config\") pod \"bf2194de-4e6e-4d75-814f-480d69124118\" (UID: \"bf2194de-4e6e-4d75-814f-480d69124118\") " Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.454962 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q6vz\" (UniqueName: \"kubernetes.io/projected/bf2194de-4e6e-4d75-814f-480d69124118-kube-api-access-9q6vz\") pod \"bf2194de-4e6e-4d75-814f-480d69124118\" (UID: \"bf2194de-4e6e-4d75-814f-480d69124118\") " Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.454990 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf2194de-4e6e-4d75-814f-480d69124118-dns-svc\") pod \"bf2194de-4e6e-4d75-814f-480d69124118\" (UID: \"bf2194de-4e6e-4d75-814f-480d69124118\") " Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.455007 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf2194de-4e6e-4d75-814f-480d69124118-ovsdbserver-nb\") pod \"bf2194de-4e6e-4d75-814f-480d69124118\" (UID: \"bf2194de-4e6e-4d75-814f-480d69124118\") " Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.464062 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf2194de-4e6e-4d75-814f-480d69124118-kube-api-access-9q6vz" (OuterVolumeSpecName: "kube-api-access-9q6vz") pod "bf2194de-4e6e-4d75-814f-480d69124118" (UID: "bf2194de-4e6e-4d75-814f-480d69124118"). InnerVolumeSpecName "kube-api-access-9q6vz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.539484 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf2194de-4e6e-4d75-814f-480d69124118-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bf2194de-4e6e-4d75-814f-480d69124118" (UID: "bf2194de-4e6e-4d75-814f-480d69124118"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.539493 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf2194de-4e6e-4d75-814f-480d69124118-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bf2194de-4e6e-4d75-814f-480d69124118" (UID: "bf2194de-4e6e-4d75-814f-480d69124118"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.557235 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q6vz\" (UniqueName: \"kubernetes.io/projected/bf2194de-4e6e-4d75-814f-480d69124118-kube-api-access-9q6vz\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.557270 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf2194de-4e6e-4d75-814f-480d69124118-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.557282 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf2194de-4e6e-4d75-814f-480d69124118-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.570886 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf2194de-4e6e-4d75-814f-480d69124118-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bf2194de-4e6e-4d75-814f-480d69124118" (UID: "bf2194de-4e6e-4d75-814f-480d69124118"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.572867 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf2194de-4e6e-4d75-814f-480d69124118-config" (OuterVolumeSpecName: "config") pod "bf2194de-4e6e-4d75-814f-480d69124118" (UID: "bf2194de-4e6e-4d75-814f-480d69124118"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.578815 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-e36c-account-create-update-49m75" Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.658917 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jc42\" (UniqueName: \"kubernetes.io/projected/544d6a86-e6d4-47b4-916e-2dfbe467b5f6-kube-api-access-5jc42\") pod \"544d6a86-e6d4-47b4-916e-2dfbe467b5f6\" (UID: \"544d6a86-e6d4-47b4-916e-2dfbe467b5f6\") " Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.659070 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/544d6a86-e6d4-47b4-916e-2dfbe467b5f6-operator-scripts\") pod \"544d6a86-e6d4-47b4-916e-2dfbe467b5f6\" (UID: \"544d6a86-e6d4-47b4-916e-2dfbe467b5f6\") " Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.659373 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf2194de-4e6e-4d75-814f-480d69124118-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.659384 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf2194de-4e6e-4d75-814f-480d69124118-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.659501 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/544d6a86-e6d4-47b4-916e-2dfbe467b5f6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "544d6a86-e6d4-47b4-916e-2dfbe467b5f6" (UID: "544d6a86-e6d4-47b4-916e-2dfbe467b5f6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.661814 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/544d6a86-e6d4-47b4-916e-2dfbe467b5f6-kube-api-access-5jc42" (OuterVolumeSpecName: "kube-api-access-5jc42") pod "544d6a86-e6d4-47b4-916e-2dfbe467b5f6" (UID: "544d6a86-e6d4-47b4-916e-2dfbe467b5f6"). InnerVolumeSpecName "kube-api-access-5jc42". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.760765 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jc42\" (UniqueName: \"kubernetes.io/projected/544d6a86-e6d4-47b4-916e-2dfbe467b5f6-kube-api-access-5jc42\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:17 crc kubenswrapper[4869]: I0218 06:05:17.760806 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/544d6a86-e6d4-47b4-916e-2dfbe467b5f6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:18 crc kubenswrapper[4869]: I0218 06:05:18.170563 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-e36c-account-create-update-49m75" Feb 18 06:05:18 crc kubenswrapper[4869]: I0218 06:05:18.170553 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e36c-account-create-update-49m75" event={"ID":"544d6a86-e6d4-47b4-916e-2dfbe467b5f6","Type":"ContainerDied","Data":"93148a6887f500d4249e6570ac4db97824a2a4d48fcfad3b69039ed1602eb087"} Feb 18 06:05:18 crc kubenswrapper[4869]: I0218 06:05:18.171266 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93148a6887f500d4249e6570ac4db97824a2a4d48fcfad3b69039ed1602eb087" Feb 18 06:05:18 crc kubenswrapper[4869]: I0218 06:05:18.181132 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-rx7zr" event={"ID":"bf2194de-4e6e-4d75-814f-480d69124118","Type":"ContainerDied","Data":"3cbc467da7336a49b83d45874d562aa4b46be9e90e410b6ca05405081830f9d6"} Feb 18 06:05:18 crc kubenswrapper[4869]: I0218 06:05:18.181399 4869 scope.go:117] "RemoveContainer" containerID="fa6bcbdc86a045336486a638bdc9d78200e2ed1d9973d63f4978d97356fa688c" Feb 18 06:05:18 crc kubenswrapper[4869]: I0218 06:05:18.181227 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-rx7zr" Feb 18 06:05:18 crc kubenswrapper[4869]: I0218 06:05:18.231006 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-rx7zr"] Feb 18 06:05:18 crc kubenswrapper[4869]: I0218 06:05:18.244940 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-rx7zr"] Feb 18 06:05:19 crc kubenswrapper[4869]: I0218 06:05:19.480821 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf2194de-4e6e-4d75-814f-480d69124118" path="/var/lib/kubelet/pods/bf2194de-4e6e-4d75-814f-480d69124118/volumes" Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.038859 4869 scope.go:117] "RemoveContainer" containerID="c6dec9587fa8660ba7649c549d4f1d4bafc6c3c14e4fe0122fac7538fb2dedc0" Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.205474 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bede-account-create-update-tsvv4" event={"ID":"ddfad6e9-7d65-4a60-9bbd-c6b552167a4e","Type":"ContainerDied","Data":"3d69450e49be6ff942838ae90fbdd020828d10cebff2d746adc05544e8499904"} Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.205528 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d69450e49be6ff942838ae90fbdd020828d10cebff2d746adc05544e8499904" Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.208518 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5qhrw" event={"ID":"5d7b2c26-9c52-4095-a3a4-9e97e69d8cda","Type":"ContainerDied","Data":"2cbf298e06074977a3c30cf7944b20a7e146696e17224fe3081c537a26c7e313"} Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.208553 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cbf298e06074977a3c30cf7944b20a7e146696e17224fe3081c537a26c7e313" Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.210472 4869 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/barbican-db-create-rskmj" event={"ID":"c76f6717-a534-466b-8c8b-42bf00e770e8","Type":"ContainerDied","Data":"7c20ad9ec18f7503e81186d8a2442a676c951c2b95f29bf979546b21021aa126"} Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.210500 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c20ad9ec18f7503e81186d8a2442a676c951c2b95f29bf979546b21021aa126" Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.211849 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-50c2-account-create-update-fv8ng" event={"ID":"8a9c7e54-b859-4ec1-84ea-65b575a5bb54","Type":"ContainerDied","Data":"5164618db19957dbe3448135bc75072f4996a8d4943f3e2471b21eb93fc30837"} Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.211870 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5164618db19957dbe3448135bc75072f4996a8d4943f3e2471b21eb93fc30837" Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.329133 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-50c2-account-create-update-fv8ng" Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.406854 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a9c7e54-b859-4ec1-84ea-65b575a5bb54-operator-scripts\") pod \"8a9c7e54-b859-4ec1-84ea-65b575a5bb54\" (UID: \"8a9c7e54-b859-4ec1-84ea-65b575a5bb54\") " Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.407346 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rzsw\" (UniqueName: \"kubernetes.io/projected/8a9c7e54-b859-4ec1-84ea-65b575a5bb54-kube-api-access-9rzsw\") pod \"8a9c7e54-b859-4ec1-84ea-65b575a5bb54\" (UID: \"8a9c7e54-b859-4ec1-84ea-65b575a5bb54\") " Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.407576 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a9c7e54-b859-4ec1-84ea-65b575a5bb54-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8a9c7e54-b859-4ec1-84ea-65b575a5bb54" (UID: "8a9c7e54-b859-4ec1-84ea-65b575a5bb54"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.408058 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a9c7e54-b859-4ec1-84ea-65b575a5bb54-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.422878 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a9c7e54-b859-4ec1-84ea-65b575a5bb54-kube-api-access-9rzsw" (OuterVolumeSpecName: "kube-api-access-9rzsw") pod "8a9c7e54-b859-4ec1-84ea-65b575a5bb54" (UID: "8a9c7e54-b859-4ec1-84ea-65b575a5bb54"). InnerVolumeSpecName "kube-api-access-9rzsw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.424121 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-5qhrw" Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.452423 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-bede-account-create-update-tsvv4" Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.463829 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-rskmj" Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.509036 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c76f6717-a534-466b-8c8b-42bf00e770e8-operator-scripts\") pod \"c76f6717-a534-466b-8c8b-42bf00e770e8\" (UID: \"c76f6717-a534-466b-8c8b-42bf00e770e8\") " Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.509171 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sskkp\" (UniqueName: \"kubernetes.io/projected/ddfad6e9-7d65-4a60-9bbd-c6b552167a4e-kube-api-access-sskkp\") pod \"ddfad6e9-7d65-4a60-9bbd-c6b552167a4e\" (UID: \"ddfad6e9-7d65-4a60-9bbd-c6b552167a4e\") " Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.509248 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x84p9\" (UniqueName: \"kubernetes.io/projected/c76f6717-a534-466b-8c8b-42bf00e770e8-kube-api-access-x84p9\") pod \"c76f6717-a534-466b-8c8b-42bf00e770e8\" (UID: \"c76f6717-a534-466b-8c8b-42bf00e770e8\") " Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.509275 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4728n\" (UniqueName: \"kubernetes.io/projected/5d7b2c26-9c52-4095-a3a4-9e97e69d8cda-kube-api-access-4728n\") 
pod \"5d7b2c26-9c52-4095-a3a4-9e97e69d8cda\" (UID: \"5d7b2c26-9c52-4095-a3a4-9e97e69d8cda\") " Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.509429 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d7b2c26-9c52-4095-a3a4-9e97e69d8cda-operator-scripts\") pod \"5d7b2c26-9c52-4095-a3a4-9e97e69d8cda\" (UID: \"5d7b2c26-9c52-4095-a3a4-9e97e69d8cda\") " Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.509510 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddfad6e9-7d65-4a60-9bbd-c6b552167a4e-operator-scripts\") pod \"ddfad6e9-7d65-4a60-9bbd-c6b552167a4e\" (UID: \"ddfad6e9-7d65-4a60-9bbd-c6b552167a4e\") " Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.509809 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c76f6717-a534-466b-8c8b-42bf00e770e8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c76f6717-a534-466b-8c8b-42bf00e770e8" (UID: "c76f6717-a534-466b-8c8b-42bf00e770e8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.510199 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rzsw\" (UniqueName: \"kubernetes.io/projected/8a9c7e54-b859-4ec1-84ea-65b575a5bb54-kube-api-access-9rzsw\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.510221 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c76f6717-a534-466b-8c8b-42bf00e770e8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.510195 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d7b2c26-9c52-4095-a3a4-9e97e69d8cda-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5d7b2c26-9c52-4095-a3a4-9e97e69d8cda" (UID: "5d7b2c26-9c52-4095-a3a4-9e97e69d8cda"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.510333 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddfad6e9-7d65-4a60-9bbd-c6b552167a4e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ddfad6e9-7d65-4a60-9bbd-c6b552167a4e" (UID: "ddfad6e9-7d65-4a60-9bbd-c6b552167a4e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.512062 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddfad6e9-7d65-4a60-9bbd-c6b552167a4e-kube-api-access-sskkp" (OuterVolumeSpecName: "kube-api-access-sskkp") pod "ddfad6e9-7d65-4a60-9bbd-c6b552167a4e" (UID: "ddfad6e9-7d65-4a60-9bbd-c6b552167a4e"). InnerVolumeSpecName "kube-api-access-sskkp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.512274 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c76f6717-a534-466b-8c8b-42bf00e770e8-kube-api-access-x84p9" (OuterVolumeSpecName: "kube-api-access-x84p9") pod "c76f6717-a534-466b-8c8b-42bf00e770e8" (UID: "c76f6717-a534-466b-8c8b-42bf00e770e8"). InnerVolumeSpecName "kube-api-access-x84p9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.513955 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d7b2c26-9c52-4095-a3a4-9e97e69d8cda-kube-api-access-4728n" (OuterVolumeSpecName: "kube-api-access-4728n") pod "5d7b2c26-9c52-4095-a3a4-9e97e69d8cda" (UID: "5d7b2c26-9c52-4095-a3a4-9e97e69d8cda"). InnerVolumeSpecName "kube-api-access-4728n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.611860 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d7b2c26-9c52-4095-a3a4-9e97e69d8cda-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.611898 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddfad6e9-7d65-4a60-9bbd-c6b552167a4e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.611907 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sskkp\" (UniqueName: \"kubernetes.io/projected/ddfad6e9-7d65-4a60-9bbd-c6b552167a4e-kube-api-access-sskkp\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.611918 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x84p9\" (UniqueName: 
\"kubernetes.io/projected/c76f6717-a534-466b-8c8b-42bf00e770e8-kube-api-access-x84p9\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:20 crc kubenswrapper[4869]: I0218 06:05:20.611926 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4728n\" (UniqueName: \"kubernetes.io/projected/5d7b2c26-9c52-4095-a3a4-9e97e69d8cda-kube-api-access-4728n\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:21 crc kubenswrapper[4869]: I0218 06:05:21.222345 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-94jf2" event={"ID":"173c6fc9-4198-471e-b6f3-a7445d402034","Type":"ContainerStarted","Data":"9fcc2fedf44bd0e3860fbb34618e08f8de235f38893936bbf81e8f163ce96ab6"} Feb 18 06:05:21 crc kubenswrapper[4869]: I0218 06:05:21.223600 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-5qhrw" Feb 18 06:05:21 crc kubenswrapper[4869]: I0218 06:05:21.223618 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-bede-account-create-update-tsvv4" Feb 18 06:05:21 crc kubenswrapper[4869]: I0218 06:05:21.224248 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-50c2-account-create-update-fv8ng" Feb 18 06:05:21 crc kubenswrapper[4869]: I0218 06:05:21.224297 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-rskmj" Feb 18 06:05:21 crc kubenswrapper[4869]: I0218 06:05:21.246487 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-94jf2" podStartSLOduration=2.334816398 podStartE2EDuration="7.246465068s" podCreationTimestamp="2026-02-18 06:05:14 +0000 UTC" firstStartedPulling="2026-02-18 06:05:15.173528851 +0000 UTC m=+1012.342617083" lastFinishedPulling="2026-02-18 06:05:20.085177521 +0000 UTC m=+1017.254265753" observedRunningTime="2026-02-18 06:05:21.237436358 +0000 UTC m=+1018.406524590" watchObservedRunningTime="2026-02-18 06:05:21.246465068 +0000 UTC m=+1018.415553300" Feb 18 06:05:23 crc kubenswrapper[4869]: I0218 06:05:23.240846 4869 generic.go:334] "Generic (PLEG): container finished" podID="173c6fc9-4198-471e-b6f3-a7445d402034" containerID="9fcc2fedf44bd0e3860fbb34618e08f8de235f38893936bbf81e8f163ce96ab6" exitCode=0 Feb 18 06:05:23 crc kubenswrapper[4869]: I0218 06:05:23.240894 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-94jf2" event={"ID":"173c6fc9-4198-471e-b6f3-a7445d402034","Type":"ContainerDied","Data":"9fcc2fedf44bd0e3860fbb34618e08f8de235f38893936bbf81e8f163ce96ab6"} Feb 18 06:05:24 crc kubenswrapper[4869]: I0218 06:05:24.603444 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-94jf2" Feb 18 06:05:24 crc kubenswrapper[4869]: I0218 06:05:24.682919 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br8fw\" (UniqueName: \"kubernetes.io/projected/173c6fc9-4198-471e-b6f3-a7445d402034-kube-api-access-br8fw\") pod \"173c6fc9-4198-471e-b6f3-a7445d402034\" (UID: \"173c6fc9-4198-471e-b6f3-a7445d402034\") " Feb 18 06:05:24 crc kubenswrapper[4869]: I0218 06:05:24.683027 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/173c6fc9-4198-471e-b6f3-a7445d402034-config-data\") pod \"173c6fc9-4198-471e-b6f3-a7445d402034\" (UID: \"173c6fc9-4198-471e-b6f3-a7445d402034\") " Feb 18 06:05:24 crc kubenswrapper[4869]: I0218 06:05:24.683224 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/173c6fc9-4198-471e-b6f3-a7445d402034-combined-ca-bundle\") pod \"173c6fc9-4198-471e-b6f3-a7445d402034\" (UID: \"173c6fc9-4198-471e-b6f3-a7445d402034\") " Feb 18 06:05:24 crc kubenswrapper[4869]: I0218 06:05:24.689396 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/173c6fc9-4198-471e-b6f3-a7445d402034-kube-api-access-br8fw" (OuterVolumeSpecName: "kube-api-access-br8fw") pod "173c6fc9-4198-471e-b6f3-a7445d402034" (UID: "173c6fc9-4198-471e-b6f3-a7445d402034"). InnerVolumeSpecName "kube-api-access-br8fw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:24 crc kubenswrapper[4869]: I0218 06:05:24.713113 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/173c6fc9-4198-471e-b6f3-a7445d402034-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "173c6fc9-4198-471e-b6f3-a7445d402034" (UID: "173c6fc9-4198-471e-b6f3-a7445d402034"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:05:24 crc kubenswrapper[4869]: I0218 06:05:24.753894 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/173c6fc9-4198-471e-b6f3-a7445d402034-config-data" (OuterVolumeSpecName: "config-data") pod "173c6fc9-4198-471e-b6f3-a7445d402034" (UID: "173c6fc9-4198-471e-b6f3-a7445d402034"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:05:24 crc kubenswrapper[4869]: I0218 06:05:24.785062 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-br8fw\" (UniqueName: \"kubernetes.io/projected/173c6fc9-4198-471e-b6f3-a7445d402034-kube-api-access-br8fw\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:24 crc kubenswrapper[4869]: I0218 06:05:24.785103 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/173c6fc9-4198-471e-b6f3-a7445d402034-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:24 crc kubenswrapper[4869]: I0218 06:05:24.785118 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/173c6fc9-4198-471e-b6f3-a7445d402034-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.260055 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-94jf2" event={"ID":"173c6fc9-4198-471e-b6f3-a7445d402034","Type":"ContainerDied","Data":"2302b9692daa5680fd350b485a47db0175235f11c999333549871700fe00a634"} Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.260113 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2302b9692daa5680fd350b485a47db0175235f11c999333549871700fe00a634" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.260172 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-94jf2" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.456100 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-x946f"] Feb 18 06:05:25 crc kubenswrapper[4869]: E0218 06:05:25.456620 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="173c6fc9-4198-471e-b6f3-a7445d402034" containerName="keystone-db-sync" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.456640 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="173c6fc9-4198-471e-b6f3-a7445d402034" containerName="keystone-db-sync" Feb 18 06:05:25 crc kubenswrapper[4869]: E0218 06:05:25.456671 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05aa0f74-aab3-44e4-805b-4d4df0c86c5b" containerName="mariadb-database-create" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.456679 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="05aa0f74-aab3-44e4-805b-4d4df0c86c5b" containerName="mariadb-database-create" Feb 18 06:05:25 crc kubenswrapper[4869]: E0218 06:05:25.456698 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="544d6a86-e6d4-47b4-916e-2dfbe467b5f6" containerName="mariadb-account-create-update" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.456706 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="544d6a86-e6d4-47b4-916e-2dfbe467b5f6" containerName="mariadb-account-create-update" Feb 18 06:05:25 crc kubenswrapper[4869]: E0218 06:05:25.456718 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c76f6717-a534-466b-8c8b-42bf00e770e8" containerName="mariadb-database-create" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.456727 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="c76f6717-a534-466b-8c8b-42bf00e770e8" containerName="mariadb-database-create" Feb 18 06:05:25 crc kubenswrapper[4869]: E0218 06:05:25.456752 4869 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ddfad6e9-7d65-4a60-9bbd-c6b552167a4e" containerName="mariadb-account-create-update" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.456759 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddfad6e9-7d65-4a60-9bbd-c6b552167a4e" containerName="mariadb-account-create-update" Feb 18 06:05:25 crc kubenswrapper[4869]: E0218 06:05:25.456766 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf2194de-4e6e-4d75-814f-480d69124118" containerName="init" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.456772 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf2194de-4e6e-4d75-814f-480d69124118" containerName="init" Feb 18 06:05:25 crc kubenswrapper[4869]: E0218 06:05:25.456788 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a9c7e54-b859-4ec1-84ea-65b575a5bb54" containerName="mariadb-account-create-update" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.456795 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a9c7e54-b859-4ec1-84ea-65b575a5bb54" containerName="mariadb-account-create-update" Feb 18 06:05:25 crc kubenswrapper[4869]: E0218 06:05:25.456810 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf2194de-4e6e-4d75-814f-480d69124118" containerName="dnsmasq-dns" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.456821 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf2194de-4e6e-4d75-814f-480d69124118" containerName="dnsmasq-dns" Feb 18 06:05:25 crc kubenswrapper[4869]: E0218 06:05:25.456829 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d7b2c26-9c52-4095-a3a4-9e97e69d8cda" containerName="mariadb-database-create" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.456837 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d7b2c26-9c52-4095-a3a4-9e97e69d8cda" containerName="mariadb-database-create" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.457004 4869 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="8a9c7e54-b859-4ec1-84ea-65b575a5bb54" containerName="mariadb-account-create-update" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.457017 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="544d6a86-e6d4-47b4-916e-2dfbe467b5f6" containerName="mariadb-account-create-update" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.457027 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="05aa0f74-aab3-44e4-805b-4d4df0c86c5b" containerName="mariadb-database-create" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.457040 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf2194de-4e6e-4d75-814f-480d69124118" containerName="dnsmasq-dns" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.457050 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddfad6e9-7d65-4a60-9bbd-c6b552167a4e" containerName="mariadb-account-create-update" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.457060 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="173c6fc9-4198-471e-b6f3-a7445d402034" containerName="keystone-db-sync" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.457068 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="c76f6717-a534-466b-8c8b-42bf00e770e8" containerName="mariadb-database-create" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.457077 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d7b2c26-9c52-4095-a3a4-9e97e69d8cda" containerName="mariadb-database-create" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.457717 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-x946f" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.460930 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.461664 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.462028 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.462210 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-h7tvj" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.463202 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.492075 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-mmfz8"] Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.511920 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-credential-keys\") pod \"keystone-bootstrap-x946f\" (UID: \"5bb7043f-ce57-401f-844e-b96417b8e219\") " pod="openstack/keystone-bootstrap-x946f" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.512044 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-scripts\") pod \"keystone-bootstrap-x946f\" (UID: \"5bb7043f-ce57-401f-844e-b96417b8e219\") " pod="openstack/keystone-bootstrap-x946f" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.512100 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-fernet-keys\") pod \"keystone-bootstrap-x946f\" (UID: \"5bb7043f-ce57-401f-844e-b96417b8e219\") " pod="openstack/keystone-bootstrap-x946f" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.512132 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-combined-ca-bundle\") pod \"keystone-bootstrap-x946f\" (UID: \"5bb7043f-ce57-401f-844e-b96417b8e219\") " pod="openstack/keystone-bootstrap-x946f" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.512158 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljtxm\" (UniqueName: \"kubernetes.io/projected/5bb7043f-ce57-401f-844e-b96417b8e219-kube-api-access-ljtxm\") pod \"keystone-bootstrap-x946f\" (UID: \"5bb7043f-ce57-401f-844e-b96417b8e219\") " pod="openstack/keystone-bootstrap-x946f" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.512217 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-config-data\") pod \"keystone-bootstrap-x946f\" (UID: \"5bb7043f-ce57-401f-844e-b96417b8e219\") " pod="openstack/keystone-bootstrap-x946f" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.518200 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-mmfz8"] Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.518329 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-mmfz8" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.529087 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-x946f"] Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.615017 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljtxm\" (UniqueName: \"kubernetes.io/projected/5bb7043f-ce57-401f-844e-b96417b8e219-kube-api-access-ljtxm\") pod \"keystone-bootstrap-x946f\" (UID: \"5bb7043f-ce57-401f-844e-b96417b8e219\") " pod="openstack/keystone-bootstrap-x946f" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.615106 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-config-data\") pod \"keystone-bootstrap-x946f\" (UID: \"5bb7043f-ce57-401f-844e-b96417b8e219\") " pod="openstack/keystone-bootstrap-x946f" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.615129 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-credential-keys\") pod \"keystone-bootstrap-x946f\" (UID: \"5bb7043f-ce57-401f-844e-b96417b8e219\") " pod="openstack/keystone-bootstrap-x946f" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.615197 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-scripts\") pod \"keystone-bootstrap-x946f\" (UID: \"5bb7043f-ce57-401f-844e-b96417b8e219\") " pod="openstack/keystone-bootstrap-x946f" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.615232 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-fernet-keys\") pod 
\"keystone-bootstrap-x946f\" (UID: \"5bb7043f-ce57-401f-844e-b96417b8e219\") " pod="openstack/keystone-bootstrap-x946f" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.615261 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-combined-ca-bundle\") pod \"keystone-bootstrap-x946f\" (UID: \"5bb7043f-ce57-401f-844e-b96417b8e219\") " pod="openstack/keystone-bootstrap-x946f" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.619525 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-combined-ca-bundle\") pod \"keystone-bootstrap-x946f\" (UID: \"5bb7043f-ce57-401f-844e-b96417b8e219\") " pod="openstack/keystone-bootstrap-x946f" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.620171 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-credential-keys\") pod \"keystone-bootstrap-x946f\" (UID: \"5bb7043f-ce57-401f-844e-b96417b8e219\") " pod="openstack/keystone-bootstrap-x946f" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.621698 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-scripts\") pod \"keystone-bootstrap-x946f\" (UID: \"5bb7043f-ce57-401f-844e-b96417b8e219\") " pod="openstack/keystone-bootstrap-x946f" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.629726 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-fernet-keys\") pod \"keystone-bootstrap-x946f\" (UID: \"5bb7043f-ce57-401f-844e-b96417b8e219\") " pod="openstack/keystone-bootstrap-x946f" Feb 18 06:05:25 crc 
kubenswrapper[4869]: I0218 06:05:25.629924 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-config-data\") pod \"keystone-bootstrap-x946f\" (UID: \"5bb7043f-ce57-401f-844e-b96417b8e219\") " pod="openstack/keystone-bootstrap-x946f" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.651728 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-745b5b966f-7mdzf"] Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.652312 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljtxm\" (UniqueName: \"kubernetes.io/projected/5bb7043f-ce57-401f-844e-b96417b8e219-kube-api-access-ljtxm\") pod \"keystone-bootstrap-x946f\" (UID: \"5bb7043f-ce57-401f-844e-b96417b8e219\") " pod="openstack/keystone-bootstrap-x946f" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.664869 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-745b5b966f-7mdzf" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.688044 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.688835 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.688925 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-f7mvc" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.688962 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.716491 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w26xx\" (UniqueName: \"kubernetes.io/projected/894a467c-9d46-46d3-966f-324c9327d618-kube-api-access-w26xx\") pod \"dnsmasq-dns-847c4cc679-mmfz8\" (UID: \"894a467c-9d46-46d3-966f-324c9327d618\") " pod="openstack/dnsmasq-dns-847c4cc679-mmfz8" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.716565 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-mmfz8\" (UID: \"894a467c-9d46-46d3-966f-324c9327d618\") " pod="openstack/dnsmasq-dns-847c4cc679-mmfz8" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.716591 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-config\") pod \"dnsmasq-dns-847c4cc679-mmfz8\" (UID: \"894a467c-9d46-46d3-966f-324c9327d618\") " pod="openstack/dnsmasq-dns-847c4cc679-mmfz8" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 
06:05:25.716627 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-mmfz8\" (UID: \"894a467c-9d46-46d3-966f-324c9327d618\") " pod="openstack/dnsmasq-dns-847c4cc679-mmfz8" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.716668 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-mmfz8\" (UID: \"894a467c-9d46-46d3-966f-324c9327d618\") " pod="openstack/dnsmasq-dns-847c4cc679-mmfz8" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.716690 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-dns-svc\") pod \"dnsmasq-dns-847c4cc679-mmfz8\" (UID: \"894a467c-9d46-46d3-966f-324c9327d618\") " pod="openstack/dnsmasq-dns-847c4cc679-mmfz8" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.731027 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-7cn7l"] Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.732414 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-7cn7l" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.739814 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wkh8t" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.740136 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.740340 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.754911 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-745b5b966f-7mdzf"] Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.791236 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x946f" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.800898 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7cn7l"] Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.821800 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-8v5fn"] Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.822961 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-8v5fn" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.824772 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-mmfz8\" (UID: \"894a467c-9d46-46d3-966f-324c9327d618\") " pod="openstack/dnsmasq-dns-847c4cc679-mmfz8" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.824820 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-mmfz8\" (UID: \"894a467c-9d46-46d3-966f-324c9327d618\") " pod="openstack/dnsmasq-dns-847c4cc679-mmfz8" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.824843 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-dns-svc\") pod \"dnsmasq-dns-847c4cc679-mmfz8\" (UID: \"894a467c-9d46-46d3-966f-324c9327d618\") " pod="openstack/dnsmasq-dns-847c4cc679-mmfz8" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.824868 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-config-data\") pod \"horizon-745b5b966f-7mdzf\" (UID: \"8127655f-342a-4bfc-a5c3-0a44bfc3cb77\") " pod="openstack/horizon-745b5b966f-7mdzf" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.824889 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-scripts\") pod \"horizon-745b5b966f-7mdzf\" (UID: \"8127655f-342a-4bfc-a5c3-0a44bfc3cb77\") " 
pod="openstack/horizon-745b5b966f-7mdzf" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.824920 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-logs\") pod \"horizon-745b5b966f-7mdzf\" (UID: \"8127655f-342a-4bfc-a5c3-0a44bfc3cb77\") " pod="openstack/horizon-745b5b966f-7mdzf" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.824950 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w26xx\" (UniqueName: \"kubernetes.io/projected/894a467c-9d46-46d3-966f-324c9327d618-kube-api-access-w26xx\") pod \"dnsmasq-dns-847c4cc679-mmfz8\" (UID: \"894a467c-9d46-46d3-966f-324c9327d618\") " pod="openstack/dnsmasq-dns-847c4cc679-mmfz8" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.824979 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-horizon-secret-key\") pod \"horizon-745b5b966f-7mdzf\" (UID: \"8127655f-342a-4bfc-a5c3-0a44bfc3cb77\") " pod="openstack/horizon-745b5b966f-7mdzf" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.825009 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-mmfz8\" (UID: \"894a467c-9d46-46d3-966f-324c9327d618\") " pod="openstack/dnsmasq-dns-847c4cc679-mmfz8" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.825029 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-config\") pod \"dnsmasq-dns-847c4cc679-mmfz8\" (UID: \"894a467c-9d46-46d3-966f-324c9327d618\") " pod="openstack/dnsmasq-dns-847c4cc679-mmfz8" Feb 18 
06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.825046 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdhsb\" (UniqueName: \"kubernetes.io/projected/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-kube-api-access-sdhsb\") pod \"horizon-745b5b966f-7mdzf\" (UID: \"8127655f-342a-4bfc-a5c3-0a44bfc3cb77\") " pod="openstack/horizon-745b5b966f-7mdzf" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.825886 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-mmfz8\" (UID: \"894a467c-9d46-46d3-966f-324c9327d618\") " pod="openstack/dnsmasq-dns-847c4cc679-mmfz8" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.826473 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-mmfz8\" (UID: \"894a467c-9d46-46d3-966f-324c9327d618\") " pod="openstack/dnsmasq-dns-847c4cc679-mmfz8" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.826975 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-dns-svc\") pod \"dnsmasq-dns-847c4cc679-mmfz8\" (UID: \"894a467c-9d46-46d3-966f-324c9327d618\") " pod="openstack/dnsmasq-dns-847c4cc679-mmfz8" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.827702 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-mmfz8\" (UID: \"894a467c-9d46-46d3-966f-324c9327d618\") " pod="openstack/dnsmasq-dns-847c4cc679-mmfz8" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.838228 4869 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-8v5fn"] Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.840388 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.840958 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-zklr2" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.841542 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-config\") pod \"dnsmasq-dns-847c4cc679-mmfz8\" (UID: \"894a467c-9d46-46d3-966f-324c9327d618\") " pod="openstack/dnsmasq-dns-847c4cc679-mmfz8" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.844244 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.877025 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.879672 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.885728 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.886060 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.897702 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-gk2zt"] Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.898970 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-gk2zt" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.903813 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.904112 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.904363 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-fnpts" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.906329 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w26xx\" (UniqueName: \"kubernetes.io/projected/894a467c-9d46-46d3-966f-324c9327d618-kube-api-access-w26xx\") pod \"dnsmasq-dns-847c4cc679-mmfz8\" (UID: \"894a467c-9d46-46d3-966f-324c9327d618\") " pod="openstack/dnsmasq-dns-847c4cc679-mmfz8" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.926940 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdhsb\" (UniqueName: \"kubernetes.io/projected/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-kube-api-access-sdhsb\") pod \"horizon-745b5b966f-7mdzf\" (UID: \"8127655f-342a-4bfc-a5c3-0a44bfc3cb77\") " pod="openstack/horizon-745b5b966f-7mdzf" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.926992 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77d2d3cf-1108-468b-816a-64d29471542e-config-data\") pod \"cinder-db-sync-8v5fn\" (UID: \"77d2d3cf-1108-468b-816a-64d29471542e\") " pod="openstack/cinder-db-sync-8v5fn" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.927048 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-gk2zt"] Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.927053 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlkw8\" (UniqueName: \"kubernetes.io/projected/c414ef60-f94a-4047-ad26-b7ca6fa3f93b-kube-api-access-nlkw8\") pod \"neutron-db-sync-7cn7l\" (UID: \"c414ef60-f94a-4047-ad26-b7ca6fa3f93b\") " pod="openstack/neutron-db-sync-7cn7l" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.927157 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c414ef60-f94a-4047-ad26-b7ca6fa3f93b-config\") pod \"neutron-db-sync-7cn7l\" (UID: \"c414ef60-f94a-4047-ad26-b7ca6fa3f93b\") " pod="openstack/neutron-db-sync-7cn7l" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.927190 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d2d3cf-1108-468b-816a-64d29471542e-combined-ca-bundle\") pod \"cinder-db-sync-8v5fn\" (UID: \"77d2d3cf-1108-468b-816a-64d29471542e\") " pod="openstack/cinder-db-sync-8v5fn" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.927213 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77d2d3cf-1108-468b-816a-64d29471542e-scripts\") pod \"cinder-db-sync-8v5fn\" (UID: \"77d2d3cf-1108-468b-816a-64d29471542e\") " pod="openstack/cinder-db-sync-8v5fn" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.927264 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c414ef60-f94a-4047-ad26-b7ca6fa3f93b-combined-ca-bundle\") pod \"neutron-db-sync-7cn7l\" (UID: \"c414ef60-f94a-4047-ad26-b7ca6fa3f93b\") " pod="openstack/neutron-db-sync-7cn7l" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.927313 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxn66\" (UniqueName: \"kubernetes.io/projected/77d2d3cf-1108-468b-816a-64d29471542e-kube-api-access-hxn66\") pod \"cinder-db-sync-8v5fn\" (UID: \"77d2d3cf-1108-468b-816a-64d29471542e\") " pod="openstack/cinder-db-sync-8v5fn" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.927330 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/77d2d3cf-1108-468b-816a-64d29471542e-etc-machine-id\") pod \"cinder-db-sync-8v5fn\" (UID: \"77d2d3cf-1108-468b-816a-64d29471542e\") " pod="openstack/cinder-db-sync-8v5fn" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.927382 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-config-data\") pod \"horizon-745b5b966f-7mdzf\" (UID: \"8127655f-342a-4bfc-a5c3-0a44bfc3cb77\") " pod="openstack/horizon-745b5b966f-7mdzf" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.927425 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-scripts\") pod \"horizon-745b5b966f-7mdzf\" (UID: \"8127655f-342a-4bfc-a5c3-0a44bfc3cb77\") " pod="openstack/horizon-745b5b966f-7mdzf" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.927501 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-logs\") pod \"horizon-745b5b966f-7mdzf\" (UID: \"8127655f-342a-4bfc-a5c3-0a44bfc3cb77\") " pod="openstack/horizon-745b5b966f-7mdzf" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.927558 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/77d2d3cf-1108-468b-816a-64d29471542e-db-sync-config-data\") pod \"cinder-db-sync-8v5fn\" (UID: \"77d2d3cf-1108-468b-816a-64d29471542e\") " pod="openstack/cinder-db-sync-8v5fn" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.927592 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-horizon-secret-key\") pod \"horizon-745b5b966f-7mdzf\" (UID: \"8127655f-342a-4bfc-a5c3-0a44bfc3cb77\") " pod="openstack/horizon-745b5b966f-7mdzf" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.927923 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-logs\") pod \"horizon-745b5b966f-7mdzf\" (UID: \"8127655f-342a-4bfc-a5c3-0a44bfc3cb77\") " pod="openstack/horizon-745b5b966f-7mdzf" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.928730 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-scripts\") pod \"horizon-745b5b966f-7mdzf\" (UID: \"8127655f-342a-4bfc-a5c3-0a44bfc3cb77\") " pod="openstack/horizon-745b5b966f-7mdzf" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.928972 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-config-data\") pod \"horizon-745b5b966f-7mdzf\" (UID: \"8127655f-342a-4bfc-a5c3-0a44bfc3cb77\") " pod="openstack/horizon-745b5b966f-7mdzf" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.950385 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-horizon-secret-key\") pod \"horizon-745b5b966f-7mdzf\" (UID: 
\"8127655f-342a-4bfc-a5c3-0a44bfc3cb77\") " pod="openstack/horizon-745b5b966f-7mdzf" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.955938 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-76444b47f5-mc7kc"] Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.957664 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76444b47f5-mc7kc" Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.981174 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:05:25 crc kubenswrapper[4869]: I0218 06:05:25.996482 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdhsb\" (UniqueName: \"kubernetes.io/projected/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-kube-api-access-sdhsb\") pod \"horizon-745b5b966f-7mdzf\" (UID: \"8127655f-342a-4bfc-a5c3-0a44bfc3cb77\") " pod="openstack/horizon-745b5b966f-7mdzf" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.014838 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-mmfz8"] Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.015500 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-mmfz8" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.033117 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76444b47f5-mc7kc"] Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.033520 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-745b5b966f-7mdzf" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.034245 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77d2d3cf-1108-468b-816a-64d29471542e-config-data\") pod \"cinder-db-sync-8v5fn\" (UID: \"77d2d3cf-1108-468b-816a-64d29471542e\") " pod="openstack/cinder-db-sync-8v5fn" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.038691 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlkw8\" (UniqueName: \"kubernetes.io/projected/c414ef60-f94a-4047-ad26-b7ca6fa3f93b-kube-api-access-nlkw8\") pod \"neutron-db-sync-7cn7l\" (UID: \"c414ef60-f94a-4047-ad26-b7ca6fa3f93b\") " pod="openstack/neutron-db-sync-7cn7l" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.038765 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91512d0d-84f7-41c0-aca8-cbf9d2839927-scripts\") pod \"ceilometer-0\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") " pod="openstack/ceilometer-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.038828 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c414ef60-f94a-4047-ad26-b7ca6fa3f93b-config\") pod \"neutron-db-sync-7cn7l\" (UID: \"c414ef60-f94a-4047-ad26-b7ca6fa3f93b\") " pod="openstack/neutron-db-sync-7cn7l" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.038859 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d2d3cf-1108-468b-816a-64d29471542e-combined-ca-bundle\") pod \"cinder-db-sync-8v5fn\" (UID: \"77d2d3cf-1108-468b-816a-64d29471542e\") " pod="openstack/cinder-db-sync-8v5fn" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.038885 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45fbh\" (UniqueName: \"kubernetes.io/projected/91512d0d-84f7-41c0-aca8-cbf9d2839927-kube-api-access-45fbh\") pod \"ceilometer-0\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") " pod="openstack/ceilometer-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.038912 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77d2d3cf-1108-468b-816a-64d29471542e-scripts\") pod \"cinder-db-sync-8v5fn\" (UID: \"77d2d3cf-1108-468b-816a-64d29471542e\") " pod="openstack/cinder-db-sync-8v5fn" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.038945 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91512d0d-84f7-41c0-aca8-cbf9d2839927-config-data\") pod \"ceilometer-0\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") " pod="openstack/ceilometer-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.038987 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b8c093-2eb4-4220-b335-b5b94fb8776e-combined-ca-bundle\") pod \"placement-db-sync-gk2zt\" (UID: \"50b8c093-2eb4-4220-b335-b5b94fb8776e\") " pod="openstack/placement-db-sync-gk2zt" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.039022 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c414ef60-f94a-4047-ad26-b7ca6fa3f93b-combined-ca-bundle\") pod \"neutron-db-sync-7cn7l\" (UID: \"c414ef60-f94a-4047-ad26-b7ca6fa3f93b\") " pod="openstack/neutron-db-sync-7cn7l" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.039094 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/50b8c093-2eb4-4220-b335-b5b94fb8776e-logs\") pod \"placement-db-sync-gk2zt\" (UID: \"50b8c093-2eb4-4220-b335-b5b94fb8776e\") " pod="openstack/placement-db-sync-gk2zt" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.039124 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxn66\" (UniqueName: \"kubernetes.io/projected/77d2d3cf-1108-468b-816a-64d29471542e-kube-api-access-hxn66\") pod \"cinder-db-sync-8v5fn\" (UID: \"77d2d3cf-1108-468b-816a-64d29471542e\") " pod="openstack/cinder-db-sync-8v5fn" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.039147 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/77d2d3cf-1108-468b-816a-64d29471542e-etc-machine-id\") pod \"cinder-db-sync-8v5fn\" (UID: \"77d2d3cf-1108-468b-816a-64d29471542e\") " pod="openstack/cinder-db-sync-8v5fn" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.039179 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91512d0d-84f7-41c0-aca8-cbf9d2839927-run-httpd\") pod \"ceilometer-0\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") " pod="openstack/ceilometer-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.039219 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91512d0d-84f7-41c0-aca8-cbf9d2839927-log-httpd\") pod \"ceilometer-0\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") " pod="openstack/ceilometer-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.039266 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swrdw\" (UniqueName: \"kubernetes.io/projected/50b8c093-2eb4-4220-b335-b5b94fb8776e-kube-api-access-swrdw\") pod 
\"placement-db-sync-gk2zt\" (UID: \"50b8c093-2eb4-4220-b335-b5b94fb8776e\") " pod="openstack/placement-db-sync-gk2zt" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.039371 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91512d0d-84f7-41c0-aca8-cbf9d2839927-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") " pod="openstack/ceilometer-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.039418 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91512d0d-84f7-41c0-aca8-cbf9d2839927-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") " pod="openstack/ceilometer-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.039441 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b8c093-2eb4-4220-b335-b5b94fb8776e-scripts\") pod \"placement-db-sync-gk2zt\" (UID: \"50b8c093-2eb4-4220-b335-b5b94fb8776e\") " pod="openstack/placement-db-sync-gk2zt" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.039518 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b8c093-2eb4-4220-b335-b5b94fb8776e-config-data\") pod \"placement-db-sync-gk2zt\" (UID: \"50b8c093-2eb4-4220-b335-b5b94fb8776e\") " pod="openstack/placement-db-sync-gk2zt" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.039572 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/77d2d3cf-1108-468b-816a-64d29471542e-db-sync-config-data\") pod \"cinder-db-sync-8v5fn\" (UID: \"77d2d3cf-1108-468b-816a-64d29471542e\") " 
pod="openstack/cinder-db-sync-8v5fn" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.045022 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/77d2d3cf-1108-468b-816a-64d29471542e-etc-machine-id\") pod \"cinder-db-sync-8v5fn\" (UID: \"77d2d3cf-1108-468b-816a-64d29471542e\") " pod="openstack/cinder-db-sync-8v5fn" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.059818 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c414ef60-f94a-4047-ad26-b7ca6fa3f93b-config\") pod \"neutron-db-sync-7cn7l\" (UID: \"c414ef60-f94a-4047-ad26-b7ca6fa3f93b\") " pod="openstack/neutron-db-sync-7cn7l" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.072563 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/77d2d3cf-1108-468b-816a-64d29471542e-db-sync-config-data\") pod \"cinder-db-sync-8v5fn\" (UID: \"77d2d3cf-1108-468b-816a-64d29471542e\") " pod="openstack/cinder-db-sync-8v5fn" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.090646 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlkw8\" (UniqueName: \"kubernetes.io/projected/c414ef60-f94a-4047-ad26-b7ca6fa3f93b-kube-api-access-nlkw8\") pod \"neutron-db-sync-7cn7l\" (UID: \"c414ef60-f94a-4047-ad26-b7ca6fa3f93b\") " pod="openstack/neutron-db-sync-7cn7l" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.097791 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77d2d3cf-1108-468b-816a-64d29471542e-scripts\") pod \"cinder-db-sync-8v5fn\" (UID: \"77d2d3cf-1108-468b-816a-64d29471542e\") " pod="openstack/cinder-db-sync-8v5fn" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.103194 4869 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-785d8bcb8c-pnf76"] Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.108081 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.109249 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d2d3cf-1108-468b-816a-64d29471542e-combined-ca-bundle\") pod \"cinder-db-sync-8v5fn\" (UID: \"77d2d3cf-1108-468b-816a-64d29471542e\") " pod="openstack/cinder-db-sync-8v5fn" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.109578 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c414ef60-f94a-4047-ad26-b7ca6fa3f93b-combined-ca-bundle\") pod \"neutron-db-sync-7cn7l\" (UID: \"c414ef60-f94a-4047-ad26-b7ca6fa3f93b\") " pod="openstack/neutron-db-sync-7cn7l" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.113964 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxn66\" (UniqueName: \"kubernetes.io/projected/77d2d3cf-1108-468b-816a-64d29471542e-kube-api-access-hxn66\") pod \"cinder-db-sync-8v5fn\" (UID: \"77d2d3cf-1108-468b-816a-64d29471542e\") " pod="openstack/cinder-db-sync-8v5fn" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.118526 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77d2d3cf-1108-468b-816a-64d29471542e-config-data\") pod \"cinder-db-sync-8v5fn\" (UID: \"77d2d3cf-1108-468b-816a-64d29471542e\") " pod="openstack/cinder-db-sync-8v5fn" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.142637 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzmn8\" (UniqueName: \"kubernetes.io/projected/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-kube-api-access-hzmn8\") 
pod \"horizon-76444b47f5-mc7kc\" (UID: \"e4fa15d8-bacc-4ce0-bd25-41e451404ab3\") " pod="openstack/horizon-76444b47f5-mc7kc" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.142684 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-scripts\") pod \"horizon-76444b47f5-mc7kc\" (UID: \"e4fa15d8-bacc-4ce0-bd25-41e451404ab3\") " pod="openstack/horizon-76444b47f5-mc7kc" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.142713 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91512d0d-84f7-41c0-aca8-cbf9d2839927-scripts\") pod \"ceilometer-0\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") " pod="openstack/ceilometer-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.142751 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45fbh\" (UniqueName: \"kubernetes.io/projected/91512d0d-84f7-41c0-aca8-cbf9d2839927-kube-api-access-45fbh\") pod \"ceilometer-0\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") " pod="openstack/ceilometer-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.142773 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91512d0d-84f7-41c0-aca8-cbf9d2839927-config-data\") pod \"ceilometer-0\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") " pod="openstack/ceilometer-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.142793 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b8c093-2eb4-4220-b335-b5b94fb8776e-combined-ca-bundle\") pod \"placement-db-sync-gk2zt\" (UID: \"50b8c093-2eb4-4220-b335-b5b94fb8776e\") " pod="openstack/placement-db-sync-gk2zt" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 
06:05:26.142819 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50b8c093-2eb4-4220-b335-b5b94fb8776e-logs\") pod \"placement-db-sync-gk2zt\" (UID: \"50b8c093-2eb4-4220-b335-b5b94fb8776e\") " pod="openstack/placement-db-sync-gk2zt" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.142840 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91512d0d-84f7-41c0-aca8-cbf9d2839927-run-httpd\") pod \"ceilometer-0\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") " pod="openstack/ceilometer-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.142861 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91512d0d-84f7-41c0-aca8-cbf9d2839927-log-httpd\") pod \"ceilometer-0\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") " pod="openstack/ceilometer-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.142885 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swrdw\" (UniqueName: \"kubernetes.io/projected/50b8c093-2eb4-4220-b335-b5b94fb8776e-kube-api-access-swrdw\") pod \"placement-db-sync-gk2zt\" (UID: \"50b8c093-2eb4-4220-b335-b5b94fb8776e\") " pod="openstack/placement-db-sync-gk2zt" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.142904 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-config-data\") pod \"horizon-76444b47f5-mc7kc\" (UID: \"e4fa15d8-bacc-4ce0-bd25-41e451404ab3\") " pod="openstack/horizon-76444b47f5-mc7kc" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.142935 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-logs\") pod \"horizon-76444b47f5-mc7kc\" (UID: \"e4fa15d8-bacc-4ce0-bd25-41e451404ab3\") " pod="openstack/horizon-76444b47f5-mc7kc" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.142954 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91512d0d-84f7-41c0-aca8-cbf9d2839927-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") " pod="openstack/ceilometer-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.142980 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b8c093-2eb4-4220-b335-b5b94fb8776e-scripts\") pod \"placement-db-sync-gk2zt\" (UID: \"50b8c093-2eb4-4220-b335-b5b94fb8776e\") " pod="openstack/placement-db-sync-gk2zt" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.142996 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91512d0d-84f7-41c0-aca8-cbf9d2839927-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") " pod="openstack/ceilometer-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.143024 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-horizon-secret-key\") pod \"horizon-76444b47f5-mc7kc\" (UID: \"e4fa15d8-bacc-4ce0-bd25-41e451404ab3\") " pod="openstack/horizon-76444b47f5-mc7kc" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.143041 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b8c093-2eb4-4220-b335-b5b94fb8776e-config-data\") pod \"placement-db-sync-gk2zt\" (UID: 
\"50b8c093-2eb4-4220-b335-b5b94fb8776e\") " pod="openstack/placement-db-sync-gk2zt" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.154685 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50b8c093-2eb4-4220-b335-b5b94fb8776e-logs\") pod \"placement-db-sync-gk2zt\" (UID: \"50b8c093-2eb4-4220-b335-b5b94fb8776e\") " pod="openstack/placement-db-sync-gk2zt" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.162411 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91512d0d-84f7-41c0-aca8-cbf9d2839927-run-httpd\") pod \"ceilometer-0\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") " pod="openstack/ceilometer-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.163045 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b8c093-2eb4-4220-b335-b5b94fb8776e-config-data\") pod \"placement-db-sync-gk2zt\" (UID: \"50b8c093-2eb4-4220-b335-b5b94fb8776e\") " pod="openstack/placement-db-sync-gk2zt" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.163772 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91512d0d-84f7-41c0-aca8-cbf9d2839927-scripts\") pod \"ceilometer-0\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") " pod="openstack/ceilometer-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.163965 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91512d0d-84f7-41c0-aca8-cbf9d2839927-log-httpd\") pod \"ceilometer-0\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") " pod="openstack/ceilometer-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.169673 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/50b8c093-2eb4-4220-b335-b5b94fb8776e-combined-ca-bundle\") pod \"placement-db-sync-gk2zt\" (UID: \"50b8c093-2eb4-4220-b335-b5b94fb8776e\") " pod="openstack/placement-db-sync-gk2zt" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.170485 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b8c093-2eb4-4220-b335-b5b94fb8776e-scripts\") pod \"placement-db-sync-gk2zt\" (UID: \"50b8c093-2eb4-4220-b335-b5b94fb8776e\") " pod="openstack/placement-db-sync-gk2zt" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.170769 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91512d0d-84f7-41c0-aca8-cbf9d2839927-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") " pod="openstack/ceilometer-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.187631 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.191070 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.191625 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swrdw\" (UniqueName: \"kubernetes.io/projected/50b8c093-2eb4-4220-b335-b5b94fb8776e-kube-api-access-swrdw\") pod \"placement-db-sync-gk2zt\" (UID: \"50b8c093-2eb4-4220-b335-b5b94fb8776e\") " pod="openstack/placement-db-sync-gk2zt" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.208039 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91512d0d-84f7-41c0-aca8-cbf9d2839927-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") " pod="openstack/ceilometer-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.208491 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.208595 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91512d0d-84f7-41c0-aca8-cbf9d2839927-config-data\") pod \"ceilometer-0\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") " pod="openstack/ceilometer-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.208662 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.208894 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.209305 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45fbh\" (UniqueName: \"kubernetes.io/projected/91512d0d-84f7-41c0-aca8-cbf9d2839927-kube-api-access-45fbh\") pod \"ceilometer-0\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") " 
pod="openstack/ceilometer-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.212639 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-5sd9x" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.236152 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-8v5fn" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.254423 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-pnf76\" (UID: \"89712bb0-67ea-4538-9b51-82a5b629c048\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.254513 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brdhp\" (UniqueName: \"kubernetes.io/projected/89712bb0-67ea-4538-9b51-82a5b629c048-kube-api-access-brdhp\") pod \"dnsmasq-dns-785d8bcb8c-pnf76\" (UID: \"89712bb0-67ea-4538-9b51-82a5b629c048\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.254624 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzmn8\" (UniqueName: \"kubernetes.io/projected/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-kube-api-access-hzmn8\") pod \"horizon-76444b47f5-mc7kc\" (UID: \"e4fa15d8-bacc-4ce0-bd25-41e451404ab3\") " pod="openstack/horizon-76444b47f5-mc7kc" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.254679 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-scripts\") pod \"horizon-76444b47f5-mc7kc\" (UID: \"e4fa15d8-bacc-4ce0-bd25-41e451404ab3\") " pod="openstack/horizon-76444b47f5-mc7kc" Feb 18 06:05:26 crc 
kubenswrapper[4869]: I0218 06:05:26.254711 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-config\") pod \"dnsmasq-dns-785d8bcb8c-pnf76\" (UID: \"89712bb0-67ea-4538-9b51-82a5b629c048\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.254788 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-pnf76\" (UID: \"89712bb0-67ea-4538-9b51-82a5b629c048\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.254898 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-config-data\") pod \"horizon-76444b47f5-mc7kc\" (UID: \"e4fa15d8-bacc-4ce0-bd25-41e451404ab3\") " pod="openstack/horizon-76444b47f5-mc7kc" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.254953 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-pnf76\" (UID: \"89712bb0-67ea-4538-9b51-82a5b629c048\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.255015 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-logs\") pod \"horizon-76444b47f5-mc7kc\" (UID: \"e4fa15d8-bacc-4ce0-bd25-41e451404ab3\") " pod="openstack/horizon-76444b47f5-mc7kc" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.255059 
4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-pnf76\" (UID: \"89712bb0-67ea-4538-9b51-82a5b629c048\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.255122 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-horizon-secret-key\") pod \"horizon-76444b47f5-mc7kc\" (UID: \"e4fa15d8-bacc-4ce0-bd25-41e451404ab3\") " pod="openstack/horizon-76444b47f5-mc7kc" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.256924 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-config-data\") pod \"horizon-76444b47f5-mc7kc\" (UID: \"e4fa15d8-bacc-4ce0-bd25-41e451404ab3\") " pod="openstack/horizon-76444b47f5-mc7kc" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.265406 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-logs\") pod \"horizon-76444b47f5-mc7kc\" (UID: \"e4fa15d8-bacc-4ce0-bd25-41e451404ab3\") " pod="openstack/horizon-76444b47f5-mc7kc" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.265658 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-scripts\") pod \"horizon-76444b47f5-mc7kc\" (UID: \"e4fa15d8-bacc-4ce0-bd25-41e451404ab3\") " pod="openstack/horizon-76444b47f5-mc7kc" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.266725 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-horizon-secret-key\") pod \"horizon-76444b47f5-mc7kc\" (UID: \"e4fa15d8-bacc-4ce0-bd25-41e451404ab3\") " pod="openstack/horizon-76444b47f5-mc7kc" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.281356 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.292395 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzmn8\" (UniqueName: \"kubernetes.io/projected/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-kube-api-access-hzmn8\") pod \"horizon-76444b47f5-mc7kc\" (UID: \"e4fa15d8-bacc-4ce0-bd25-41e451404ab3\") " pod="openstack/horizon-76444b47f5-mc7kc" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.298988 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-gk2zt" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.319471 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-76444b47f5-mc7kc" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.355665 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pnf76"] Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.356628 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-pnf76\" (UID: \"89712bb0-67ea-4538-9b51-82a5b629c048\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.356667 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv7zm\" (UniqueName: \"kubernetes.io/projected/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-kube-api-access-pv7zm\") pod \"glance-default-external-api-0\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.356705 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.356727 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-pnf76\" (UID: \"89712bb0-67ea-4538-9b51-82a5b629c048\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.356758 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.356782 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-logs\") pod \"glance-default-external-api-0\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.356812 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-pnf76\" (UID: \"89712bb0-67ea-4538-9b51-82a5b629c048\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.356827 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.356847 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brdhp\" (UniqueName: \"kubernetes.io/projected/89712bb0-67ea-4538-9b51-82a5b629c048-kube-api-access-brdhp\") pod \"dnsmasq-dns-785d8bcb8c-pnf76\" (UID: \"89712bb0-67ea-4538-9b51-82a5b629c048\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.356866 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-scripts\") pod \"glance-default-external-api-0\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.356914 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-config\") pod \"dnsmasq-dns-785d8bcb8c-pnf76\" (UID: \"89712bb0-67ea-4538-9b51-82a5b629c048\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.356933 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.356950 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-pnf76\" (UID: \"89712bb0-67ea-4538-9b51-82a5b629c048\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.356987 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-config-data\") pod \"glance-default-external-api-0\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.357797 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-pnf76\" (UID: \"89712bb0-67ea-4538-9b51-82a5b629c048\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.358562 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-pnf76\" (UID: \"89712bb0-67ea-4538-9b51-82a5b629c048\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.359153 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-pnf76\" (UID: \"89712bb0-67ea-4538-9b51-82a5b629c048\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.359872 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-config\") pod \"dnsmasq-dns-785d8bcb8c-pnf76\" (UID: \"89712bb0-67ea-4538-9b51-82a5b629c048\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.360412 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-pnf76\" (UID: \"89712bb0-67ea-4538-9b51-82a5b629c048\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.363578 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-7cn7l" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.374090 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.409475 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brdhp\" (UniqueName: \"kubernetes.io/projected/89712bb0-67ea-4538-9b51-82a5b629c048-kube-api-access-brdhp\") pod \"dnsmasq-dns-785d8bcb8c-pnf76\" (UID: \"89712bb0-67ea-4538-9b51-82a5b629c048\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.461313 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-config-data\") pod \"glance-default-external-api-0\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.461415 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv7zm\" (UniqueName: \"kubernetes.io/projected/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-kube-api-access-pv7zm\") pod \"glance-default-external-api-0\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.461457 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.461488 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.461517 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-logs\") pod \"glance-default-external-api-0\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.461553 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.461580 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-scripts\") pod \"glance-default-external-api-0\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.461645 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.463147 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.463396 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-logs\") pod \"glance-default-external-api-0\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.465526 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.490426 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.544833 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-zzcxf"] Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.547929 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-zzcxf" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.554780 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-f5xlh" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.557671 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.592705 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-scripts\") pod \"glance-default-external-api-0\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.599083 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.600712 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.631636 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv7zm\" (UniqueName: \"kubernetes.io/projected/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-kube-api-access-pv7zm\") pod \"glance-default-external-api-0\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 
06:05:26.631705 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-config-data\") pod \"glance-default-external-api-0\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.653154 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.658202 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-zzcxf"] Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.679284 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffdffd9a-f626-4bf2-b1e0-104eca55e7f5-combined-ca-bundle\") pod \"barbican-db-sync-zzcxf\" (UID: \"ffdffd9a-f626-4bf2-b1e0-104eca55e7f5\") " pod="openstack/barbican-db-sync-zzcxf" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.679368 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ffdffd9a-f626-4bf2-b1e0-104eca55e7f5-db-sync-config-data\") pod \"barbican-db-sync-zzcxf\" (UID: \"ffdffd9a-f626-4bf2-b1e0-104eca55e7f5\") " pod="openstack/barbican-db-sync-zzcxf" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.679407 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hktd\" (UniqueName: \"kubernetes.io/projected/ffdffd9a-f626-4bf2-b1e0-104eca55e7f5-kube-api-access-8hktd\") pod \"barbican-db-sync-zzcxf\" (UID: 
\"ffdffd9a-f626-4bf2-b1e0-104eca55e7f5\") " pod="openstack/barbican-db-sync-zzcxf" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.693797 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.699475 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.708584 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.709004 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.736865 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.781207 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0de7d58-05cc-44bb-93f7-d74404c4ea56-logs\") pod \"glance-default-internal-api-0\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.781264 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0de7d58-05cc-44bb-93f7-d74404c4ea56-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.781288 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d0de7d58-05cc-44bb-93f7-d74404c4ea56-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.781394 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffdffd9a-f626-4bf2-b1e0-104eca55e7f5-combined-ca-bundle\") pod \"barbican-db-sync-zzcxf\" (UID: \"ffdffd9a-f626-4bf2-b1e0-104eca55e7f5\") " pod="openstack/barbican-db-sync-zzcxf" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.781497 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4fvm\" (UniqueName: \"kubernetes.io/projected/d0de7d58-05cc-44bb-93f7-d74404c4ea56-kube-api-access-l4fvm\") pod \"glance-default-internal-api-0\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.781557 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ffdffd9a-f626-4bf2-b1e0-104eca55e7f5-db-sync-config-data\") pod \"barbican-db-sync-zzcxf\" (UID: \"ffdffd9a-f626-4bf2-b1e0-104eca55e7f5\") " pod="openstack/barbican-db-sync-zzcxf" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.781595 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d0de7d58-05cc-44bb-93f7-d74404c4ea56-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.781616 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d0de7d58-05cc-44bb-93f7-d74404c4ea56-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.781670 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hktd\" (UniqueName: \"kubernetes.io/projected/ffdffd9a-f626-4bf2-b1e0-104eca55e7f5-kube-api-access-8hktd\") pod \"barbican-db-sync-zzcxf\" (UID: \"ffdffd9a-f626-4bf2-b1e0-104eca55e7f5\") " pod="openstack/barbican-db-sync-zzcxf" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.781692 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.781772 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0de7d58-05cc-44bb-93f7-d74404c4ea56-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.794513 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffdffd9a-f626-4bf2-b1e0-104eca55e7f5-combined-ca-bundle\") pod \"barbican-db-sync-zzcxf\" (UID: \"ffdffd9a-f626-4bf2-b1e0-104eca55e7f5\") " pod="openstack/barbican-db-sync-zzcxf" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.795954 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/ffdffd9a-f626-4bf2-b1e0-104eca55e7f5-db-sync-config-data\") pod \"barbican-db-sync-zzcxf\" (UID: \"ffdffd9a-f626-4bf2-b1e0-104eca55e7f5\") " pod="openstack/barbican-db-sync-zzcxf" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.813479 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hktd\" (UniqueName: \"kubernetes.io/projected/ffdffd9a-f626-4bf2-b1e0-104eca55e7f5-kube-api-access-8hktd\") pod \"barbican-db-sync-zzcxf\" (UID: \"ffdffd9a-f626-4bf2-b1e0-104eca55e7f5\") " pod="openstack/barbican-db-sync-zzcxf" Feb 18 06:05:26 crc kubenswrapper[4869]: I0218 06:05:26.856572 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:26.895685 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0de7d58-05cc-44bb-93f7-d74404c4ea56-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:26.895794 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0de7d58-05cc-44bb-93f7-d74404c4ea56-logs\") pod \"glance-default-internal-api-0\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:26.895837 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0de7d58-05cc-44bb-93f7-d74404c4ea56-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:26.895859 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0de7d58-05cc-44bb-93f7-d74404c4ea56-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:26.895904 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4fvm\" (UniqueName: \"kubernetes.io/projected/d0de7d58-05cc-44bb-93f7-d74404c4ea56-kube-api-access-l4fvm\") pod \"glance-default-internal-api-0\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:26.895943 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d0de7d58-05cc-44bb-93f7-d74404c4ea56-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:26.895963 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0de7d58-05cc-44bb-93f7-d74404c4ea56-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:26.896000 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:26.898567 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d0de7d58-05cc-44bb-93f7-d74404c4ea56-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:26.899101 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0de7d58-05cc-44bb-93f7-d74404c4ea56-logs\") pod \"glance-default-internal-api-0\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:26.900713 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0de7d58-05cc-44bb-93f7-d74404c4ea56-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:26.916397 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0de7d58-05cc-44bb-93f7-d74404c4ea56-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:26.917919 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:26.921266 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d0de7d58-05cc-44bb-93f7-d74404c4ea56-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:26.923184 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0de7d58-05cc-44bb-93f7-d74404c4ea56-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:26.958639 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4fvm\" (UniqueName: \"kubernetes.io/projected/d0de7d58-05cc-44bb-93f7-d74404c4ea56-kube-api-access-l4fvm\") pod \"glance-default-internal-api-0\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:26.970357 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-x946f"] Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:26.974855 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:26.996015 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-zzcxf" Feb 18 06:05:27 crc kubenswrapper[4869]: W0218 06:05:27.001073 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod894a467c_9d46_46d3_966f_324c9327d618.slice/crio-7ab03f00a8ec77e410471bd0d81cab7329f4974ccf77679b126f51929551a247 WatchSource:0}: Error finding container 7ab03f00a8ec77e410471bd0d81cab7329f4974ccf77679b126f51929551a247: Status 404 returned error can't find the container with id 7ab03f00a8ec77e410471bd0d81cab7329f4974ccf77679b126f51929551a247 Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:27.025804 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-mmfz8"] Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:27.038698 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-745b5b966f-7mdzf"] Feb 18 06:05:27 crc kubenswrapper[4869]: W0218 06:05:27.072181 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8127655f_342a_4bfc_a5c3_0a44bfc3cb77.slice/crio-1667417614db1ddb3113785522c046ad2859005f5c04d8694ff6cd4bcf191787 WatchSource:0}: Error finding container 1667417614db1ddb3113785522c046ad2859005f5c04d8694ff6cd4bcf191787: Status 404 returned error can't find the container with id 1667417614db1ddb3113785522c046ad2859005f5c04d8694ff6cd4bcf191787 Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:27.224692 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-8v5fn"] Feb 18 06:05:27 crc kubenswrapper[4869]: W0218 06:05:27.230751 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77d2d3cf_1108_468b_816a_64d29471542e.slice/crio-10037a653d3f631c8e2b8bd77b0e33eb27a777b6d3b6d574802873cd86653038 WatchSource:0}: Error finding container 
10037a653d3f631c8e2b8bd77b0e33eb27a777b6d3b6d574802873cd86653038: Status 404 returned error can't find the container with id 10037a653d3f631c8e2b8bd77b0e33eb27a777b6d3b6d574802873cd86653038 Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:27.253694 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:27.337550 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x946f" event={"ID":"5bb7043f-ce57-401f-844e-b96417b8e219","Type":"ContainerStarted","Data":"cc0a7c6e939da768bd5956a41cdac243c14d475ab28870517e968bd5b869b21f"} Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:27.337618 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x946f" event={"ID":"5bb7043f-ce57-401f-844e-b96417b8e219","Type":"ContainerStarted","Data":"70e06859a3f8c9663edad92b1ee44c375049f77195dd29959674b28f3705cce9"} Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:27.345438 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-8v5fn" event={"ID":"77d2d3cf-1108-468b-816a-64d29471542e","Type":"ContainerStarted","Data":"10037a653d3f631c8e2b8bd77b0e33eb27a777b6d3b6d574802873cd86653038"} Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:27.349278 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-mmfz8" event={"ID":"894a467c-9d46-46d3-966f-324c9327d618","Type":"ContainerStarted","Data":"1bd88a41774bbc6a81ceaac78b8ba534849e24d405e97844b8236c79c183382c"} Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:27.349418 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-mmfz8" event={"ID":"894a467c-9d46-46d3-966f-324c9327d618","Type":"ContainerStarted","Data":"7ab03f00a8ec77e410471bd0d81cab7329f4974ccf77679b126f51929551a247"} Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:27.353820 
4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-745b5b966f-7mdzf" event={"ID":"8127655f-342a-4bfc-a5c3-0a44bfc3cb77","Type":"ContainerStarted","Data":"1667417614db1ddb3113785522c046ad2859005f5c04d8694ff6cd4bcf191787"} Feb 18 06:05:27 crc kubenswrapper[4869]: I0218 06:05:27.370529 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-x946f" podStartSLOduration=2.370498664 podStartE2EDuration="2.370498664s" podCreationTimestamp="2026-02-18 06:05:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:27.360278026 +0000 UTC m=+1024.529366258" watchObservedRunningTime="2026-02-18 06:05:27.370498664 +0000 UTC m=+1024.539586896" Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.030289 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-gk2zt"] Feb 18 06:05:28 crc kubenswrapper[4869]: W0218 06:05:28.045540 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50b8c093_2eb4_4220_b335_b5b94fb8776e.slice/crio-232fee36f86a085fa302a1b50fdcbc82d09b4a4f40df44365ab7bfb5be80bc99 WatchSource:0}: Error finding container 232fee36f86a085fa302a1b50fdcbc82d09b4a4f40df44365ab7bfb5be80bc99: Status 404 returned error can't find the container with id 232fee36f86a085fa302a1b50fdcbc82d09b4a4f40df44365ab7bfb5be80bc99 Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.047721 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.064401 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7cn7l"] Feb 18 06:05:28 crc kubenswrapper[4869]: W0218 06:05:28.114680 4869 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc414ef60_f94a_4047_ad26_b7ca6fa3f93b.slice/crio-93cdc1a2c705631058497960e54db5e21654d74d925b2dae5a82104c147c87d5 WatchSource:0}: Error finding container 93cdc1a2c705631058497960e54db5e21654d74d925b2dae5a82104c147c87d5: Status 404 returned error can't find the container with id 93cdc1a2c705631058497960e54db5e21654d74d925b2dae5a82104c147c87d5 Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.285201 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-mmfz8" Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.333310 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w26xx\" (UniqueName: \"kubernetes.io/projected/894a467c-9d46-46d3-966f-324c9327d618-kube-api-access-w26xx\") pod \"894a467c-9d46-46d3-966f-324c9327d618\" (UID: \"894a467c-9d46-46d3-966f-324c9327d618\") " Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.333399 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-config\") pod \"894a467c-9d46-46d3-966f-324c9327d618\" (UID: \"894a467c-9d46-46d3-966f-324c9327d618\") " Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.333622 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-ovsdbserver-sb\") pod \"894a467c-9d46-46d3-966f-324c9327d618\" (UID: \"894a467c-9d46-46d3-966f-324c9327d618\") " Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.333683 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-dns-svc\") pod \"894a467c-9d46-46d3-966f-324c9327d618\" (UID: 
\"894a467c-9d46-46d3-966f-324c9327d618\") " Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.333728 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-dns-swift-storage-0\") pod \"894a467c-9d46-46d3-966f-324c9327d618\" (UID: \"894a467c-9d46-46d3-966f-324c9327d618\") " Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.333796 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-ovsdbserver-nb\") pod \"894a467c-9d46-46d3-966f-324c9327d618\" (UID: \"894a467c-9d46-46d3-966f-324c9327d618\") " Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.358005 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/894a467c-9d46-46d3-966f-324c9327d618-kube-api-access-w26xx" (OuterVolumeSpecName: "kube-api-access-w26xx") pod "894a467c-9d46-46d3-966f-324c9327d618" (UID: "894a467c-9d46-46d3-966f-324c9327d618"). InnerVolumeSpecName "kube-api-access-w26xx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.367122 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "894a467c-9d46-46d3-966f-324c9327d618" (UID: "894a467c-9d46-46d3-966f-324c9327d618"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.383670 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "894a467c-9d46-46d3-966f-324c9327d618" (UID: "894a467c-9d46-46d3-966f-324c9327d618"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.387789 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91512d0d-84f7-41c0-aca8-cbf9d2839927","Type":"ContainerStarted","Data":"4224db997a002b231eaa2717c1b6efdec1847ee9262636ac76560dd4d286df53"} Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.402231 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-config" (OuterVolumeSpecName: "config") pod "894a467c-9d46-46d3-966f-324c9327d618" (UID: "894a467c-9d46-46d3-966f-324c9327d618"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.408479 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "894a467c-9d46-46d3-966f-324c9327d618" (UID: "894a467c-9d46-46d3-966f-324c9327d618"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.413446 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gk2zt" event={"ID":"50b8c093-2eb4-4220-b335-b5b94fb8776e","Type":"ContainerStarted","Data":"232fee36f86a085fa302a1b50fdcbc82d09b4a4f40df44365ab7bfb5be80bc99"} Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.416707 4869 generic.go:334] "Generic (PLEG): container finished" podID="894a467c-9d46-46d3-966f-324c9327d618" containerID="1bd88a41774bbc6a81ceaac78b8ba534849e24d405e97844b8236c79c183382c" exitCode=0 Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.416868 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-mmfz8" event={"ID":"894a467c-9d46-46d3-966f-324c9327d618","Type":"ContainerDied","Data":"1bd88a41774bbc6a81ceaac78b8ba534849e24d405e97844b8236c79c183382c"} Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.417042 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-mmfz8" event={"ID":"894a467c-9d46-46d3-966f-324c9327d618","Type":"ContainerDied","Data":"7ab03f00a8ec77e410471bd0d81cab7329f4974ccf77679b126f51929551a247"} Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.417116 4869 scope.go:117] "RemoveContainer" containerID="1bd88a41774bbc6a81ceaac78b8ba534849e24d405e97844b8236c79c183382c" Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.417365 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-mmfz8" Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.418355 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76444b47f5-mc7kc"] Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.430044 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7cn7l" event={"ID":"c414ef60-f94a-4047-ad26-b7ca6fa3f93b","Type":"ContainerStarted","Data":"93cdc1a2c705631058497960e54db5e21654d74d925b2dae5a82104c147c87d5"} Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.437601 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pnf76"] Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.438612 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.438650 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.438661 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.438671 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w26xx\" (UniqueName: \"kubernetes.io/projected/894a467c-9d46-46d3-966f-324c9327d618-kube-api-access-w26xx\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.438680 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-config\") on 
node \"crc\" DevicePath \"\"" Feb 18 06:05:28 crc kubenswrapper[4869]: W0218 06:05:28.439366 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89712bb0_67ea_4538_9b51_82a5b629c048.slice/crio-54562d11f55656054de8c934b1115eb1ccf080952299f5e65a78355e66d4918b WatchSource:0}: Error finding container 54562d11f55656054de8c934b1115eb1ccf080952299f5e65a78355e66d4918b: Status 404 returned error can't find the container with id 54562d11f55656054de8c934b1115eb1ccf080952299f5e65a78355e66d4918b Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.440069 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "894a467c-9d46-46d3-966f-324c9327d618" (UID: "894a467c-9d46-46d3-966f-324c9327d618"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:28 crc kubenswrapper[4869]: W0218 06:05:28.457910 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4fa15d8_bacc_4ce0_bd25_41e451404ab3.slice/crio-a87ca12110732479bd31bc54a3166cf1a592f9c7963d09e005d603c4b93c5196 WatchSource:0}: Error finding container a87ca12110732479bd31bc54a3166cf1a592f9c7963d09e005d603c4b93c5196: Status 404 returned error can't find the container with id a87ca12110732479bd31bc54a3166cf1a592f9c7963d09e005d603c4b93c5196 Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.462988 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-zzcxf"] Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.465034 4869 scope.go:117] "RemoveContainer" containerID="1bd88a41774bbc6a81ceaac78b8ba534849e24d405e97844b8236c79c183382c" Feb 18 06:05:28 crc kubenswrapper[4869]: E0218 06:05:28.466144 4869 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bd88a41774bbc6a81ceaac78b8ba534849e24d405e97844b8236c79c183382c\": container with ID starting with 1bd88a41774bbc6a81ceaac78b8ba534849e24d405e97844b8236c79c183382c not found: ID does not exist" containerID="1bd88a41774bbc6a81ceaac78b8ba534849e24d405e97844b8236c79c183382c" Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.466171 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bd88a41774bbc6a81ceaac78b8ba534849e24d405e97844b8236c79c183382c"} err="failed to get container status \"1bd88a41774bbc6a81ceaac78b8ba534849e24d405e97844b8236c79c183382c\": rpc error: code = NotFound desc = could not find container \"1bd88a41774bbc6a81ceaac78b8ba534849e24d405e97844b8236c79c183382c\": container with ID starting with 1bd88a41774bbc6a81ceaac78b8ba534849e24d405e97844b8236c79c183382c not found: ID does not exist" Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.466690 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-7cn7l" podStartSLOduration=3.466671089 podStartE2EDuration="3.466671089s" podCreationTimestamp="2026-02-18 06:05:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:28.457843224 +0000 UTC m=+1025.626931456" watchObservedRunningTime="2026-02-18 06:05:28.466671089 +0000 UTC m=+1025.635759321" Feb 18 06:05:28 crc kubenswrapper[4869]: W0218 06:05:28.467786 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffdffd9a_f626_4bf2_b1e0_104eca55e7f5.slice/crio-c69d7a1eee675ad7001a560b171491ea2be16419cfc958eb747423c556eb345e WatchSource:0}: Error finding container c69d7a1eee675ad7001a560b171491ea2be16419cfc958eb747423c556eb345e: Status 404 returned error can't find the 
container with id c69d7a1eee675ad7001a560b171491ea2be16419cfc958eb747423c556eb345e Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.526885 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.540460 4869 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/894a467c-9d46-46d3-966f-324c9327d618-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:28 crc kubenswrapper[4869]: W0218 06:05:28.555970 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0de7d58_05cc_44bb_93f7_d74404c4ea56.slice/crio-f01002c44f133e802445e87c5691a78a90c57ba9cee6fe8242dee13bbf86092f WatchSource:0}: Error finding container f01002c44f133e802445e87c5691a78a90c57ba9cee6fe8242dee13bbf86092f: Status 404 returned error can't find the container with id f01002c44f133e802445e87c5691a78a90c57ba9cee6fe8242dee13bbf86092f Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.885142 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-mmfz8"] Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.914172 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-mmfz8"] Feb 18 06:05:28 crc kubenswrapper[4869]: I0218 06:05:28.932049 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.014838 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-745b5b966f-7mdzf"] Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.033166 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.049189 4869 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/ceilometer-0"] Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.078801 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-588975f6bf-9447r"] Feb 18 06:05:29 crc kubenswrapper[4869]: E0218 06:05:29.079272 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="894a467c-9d46-46d3-966f-324c9327d618" containerName="init" Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.079288 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="894a467c-9d46-46d3-966f-324c9327d618" containerName="init" Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.079454 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="894a467c-9d46-46d3-966f-324c9327d618" containerName="init" Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.080519 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-588975f6bf-9447r" Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.117574 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-588975f6bf-9447r"] Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.160353 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn9nr\" (UniqueName: \"kubernetes.io/projected/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-kube-api-access-bn9nr\") pod \"horizon-588975f6bf-9447r\" (UID: \"0b9571fd-7aa2-4e30-81f9-465a9c4291c8\") " pod="openstack/horizon-588975f6bf-9447r" Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.160456 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-horizon-secret-key\") pod \"horizon-588975f6bf-9447r\" (UID: \"0b9571fd-7aa2-4e30-81f9-465a9c4291c8\") " pod="openstack/horizon-588975f6bf-9447r" Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.160520 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-logs\") pod \"horizon-588975f6bf-9447r\" (UID: \"0b9571fd-7aa2-4e30-81f9-465a9c4291c8\") " pod="openstack/horizon-588975f6bf-9447r" Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.160555 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-scripts\") pod \"horizon-588975f6bf-9447r\" (UID: \"0b9571fd-7aa2-4e30-81f9-465a9c4291c8\") " pod="openstack/horizon-588975f6bf-9447r" Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.160596 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-config-data\") pod \"horizon-588975f6bf-9447r\" (UID: \"0b9571fd-7aa2-4e30-81f9-465a9c4291c8\") " pod="openstack/horizon-588975f6bf-9447r" Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.225965 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.264365 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-scripts\") pod \"horizon-588975f6bf-9447r\" (UID: \"0b9571fd-7aa2-4e30-81f9-465a9c4291c8\") " pod="openstack/horizon-588975f6bf-9447r" Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.264452 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-config-data\") pod \"horizon-588975f6bf-9447r\" (UID: \"0b9571fd-7aa2-4e30-81f9-465a9c4291c8\") " pod="openstack/horizon-588975f6bf-9447r" 
Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.264495 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn9nr\" (UniqueName: \"kubernetes.io/projected/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-kube-api-access-bn9nr\") pod \"horizon-588975f6bf-9447r\" (UID: \"0b9571fd-7aa2-4e30-81f9-465a9c4291c8\") " pod="openstack/horizon-588975f6bf-9447r" Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.264550 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-horizon-secret-key\") pod \"horizon-588975f6bf-9447r\" (UID: \"0b9571fd-7aa2-4e30-81f9-465a9c4291c8\") " pod="openstack/horizon-588975f6bf-9447r" Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.264635 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-logs\") pod \"horizon-588975f6bf-9447r\" (UID: \"0b9571fd-7aa2-4e30-81f9-465a9c4291c8\") " pod="openstack/horizon-588975f6bf-9447r" Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.265768 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-logs\") pod \"horizon-588975f6bf-9447r\" (UID: \"0b9571fd-7aa2-4e30-81f9-465a9c4291c8\") " pod="openstack/horizon-588975f6bf-9447r" Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.265883 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-scripts\") pod \"horizon-588975f6bf-9447r\" (UID: \"0b9571fd-7aa2-4e30-81f9-465a9c4291c8\") " pod="openstack/horizon-588975f6bf-9447r" Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.267207 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-config-data\") pod \"horizon-588975f6bf-9447r\" (UID: \"0b9571fd-7aa2-4e30-81f9-465a9c4291c8\") " pod="openstack/horizon-588975f6bf-9447r" Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.274832 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-horizon-secret-key\") pod \"horizon-588975f6bf-9447r\" (UID: \"0b9571fd-7aa2-4e30-81f9-465a9c4291c8\") " pod="openstack/horizon-588975f6bf-9447r" Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.285245 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn9nr\" (UniqueName: \"kubernetes.io/projected/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-kube-api-access-bn9nr\") pod \"horizon-588975f6bf-9447r\" (UID: \"0b9571fd-7aa2-4e30-81f9-465a9c4291c8\") " pod="openstack/horizon-588975f6bf-9447r" Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.418462 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-588975f6bf-9447r" Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.455209 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"49b497a5-fbf7-42a0-87c5-0edd1b788ee1","Type":"ContainerStarted","Data":"5d5f40ef4d50fa74f787b6d6b40be9947bc5f65b824c7481b2689ffc369e9b74"} Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.460132 4869 generic.go:334] "Generic (PLEG): container finished" podID="89712bb0-67ea-4538-9b51-82a5b629c048" containerID="b28392a3421514293f0ad34c0a53497dba87ca0d5e150417d6a07bc265c84277" exitCode=0 Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.460228 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" event={"ID":"89712bb0-67ea-4538-9b51-82a5b629c048","Type":"ContainerDied","Data":"b28392a3421514293f0ad34c0a53497dba87ca0d5e150417d6a07bc265c84277"} Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.460263 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" event={"ID":"89712bb0-67ea-4538-9b51-82a5b629c048","Type":"ContainerStarted","Data":"54562d11f55656054de8c934b1115eb1ccf080952299f5e65a78355e66d4918b"} Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.463383 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76444b47f5-mc7kc" event={"ID":"e4fa15d8-bacc-4ce0-bd25-41e451404ab3","Type":"ContainerStarted","Data":"a87ca12110732479bd31bc54a3166cf1a592f9c7963d09e005d603c4b93c5196"} Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.555514 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="894a467c-9d46-46d3-966f-324c9327d618" path="/var/lib/kubelet/pods/894a467c-9d46-46d3-966f-324c9327d618/volumes" Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.556553 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7cn7l" 
event={"ID":"c414ef60-f94a-4047-ad26-b7ca6fa3f93b","Type":"ContainerStarted","Data":"daff6c6b03f528b83967bcbb8355a5fc18ffb7d072cccff4c7d7d4e3ba44ca7d"} Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.561005 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d0de7d58-05cc-44bb-93f7-d74404c4ea56","Type":"ContainerStarted","Data":"f01002c44f133e802445e87c5691a78a90c57ba9cee6fe8242dee13bbf86092f"} Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.561024 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zzcxf" event={"ID":"ffdffd9a-f626-4bf2-b1e0-104eca55e7f5","Type":"ContainerStarted","Data":"c69d7a1eee675ad7001a560b171491ea2be16419cfc958eb747423c556eb345e"} Feb 18 06:05:29 crc kubenswrapper[4869]: I0218 06:05:29.976951 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-588975f6bf-9447r"] Feb 18 06:05:30 crc kubenswrapper[4869]: W0218 06:05:30.037939 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b9571fd_7aa2_4e30_81f9_465a9c4291c8.slice/crio-74f418bd00b17e46b978b01deb0d626d6353739e0224c848a7e2fd3390f8ef04 WatchSource:0}: Error finding container 74f418bd00b17e46b978b01deb0d626d6353739e0224c848a7e2fd3390f8ef04: Status 404 returned error can't find the container with id 74f418bd00b17e46b978b01deb0d626d6353739e0224c848a7e2fd3390f8ef04 Feb 18 06:05:30 crc kubenswrapper[4869]: I0218 06:05:30.540651 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-588975f6bf-9447r" event={"ID":"0b9571fd-7aa2-4e30-81f9-465a9c4291c8","Type":"ContainerStarted","Data":"74f418bd00b17e46b978b01deb0d626d6353739e0224c848a7e2fd3390f8ef04"} Feb 18 06:05:30 crc kubenswrapper[4869]: I0218 06:05:30.548226 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"d0de7d58-05cc-44bb-93f7-d74404c4ea56","Type":"ContainerStarted","Data":"7816ce0a15d7c39b2a7aab31b12c49bf3c956380c139ce57b0f6abe2c0fffc74"} Feb 18 06:05:30 crc kubenswrapper[4869]: I0218 06:05:30.551930 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"49b497a5-fbf7-42a0-87c5-0edd1b788ee1","Type":"ContainerStarted","Data":"5307b8bc58eae29370fac3f174b160e2bcf3d4deed155a2c7de5b81bb8e3dd8f"} Feb 18 06:05:30 crc kubenswrapper[4869]: I0218 06:05:30.568818 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" event={"ID":"89712bb0-67ea-4538-9b51-82a5b629c048","Type":"ContainerStarted","Data":"f6fba7a1a763e656bb1ccede00c69b1cdeb2d81601366d2e062fa7b6a19e8a77"} Feb 18 06:05:30 crc kubenswrapper[4869]: I0218 06:05:30.569144 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" Feb 18 06:05:30 crc kubenswrapper[4869]: I0218 06:05:30.591557 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" podStartSLOduration=5.591516217 podStartE2EDuration="5.591516217s" podCreationTimestamp="2026-02-18 06:05:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:30.589232692 +0000 UTC m=+1027.758320924" watchObservedRunningTime="2026-02-18 06:05:30.591516217 +0000 UTC m=+1027.760604489" Feb 18 06:05:31 crc kubenswrapper[4869]: I0218 06:05:31.587393 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d0de7d58-05cc-44bb-93f7-d74404c4ea56","Type":"ContainerStarted","Data":"295374791c43887f553c0032b6a4f6bb2a9d05879bb74f5633032af93b66511f"} Feb 18 06:05:31 crc kubenswrapper[4869]: I0218 06:05:31.587860 4869 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-internal-api-0" podUID="d0de7d58-05cc-44bb-93f7-d74404c4ea56" containerName="glance-log" containerID="cri-o://7816ce0a15d7c39b2a7aab31b12c49bf3c956380c139ce57b0f6abe2c0fffc74" gracePeriod=30 Feb 18 06:05:31 crc kubenswrapper[4869]: I0218 06:05:31.588359 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d0de7d58-05cc-44bb-93f7-d74404c4ea56" containerName="glance-httpd" containerID="cri-o://295374791c43887f553c0032b6a4f6bb2a9d05879bb74f5633032af93b66511f" gracePeriod=30 Feb 18 06:05:31 crc kubenswrapper[4869]: I0218 06:05:31.615292 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="49b497a5-fbf7-42a0-87c5-0edd1b788ee1" containerName="glance-log" containerID="cri-o://5307b8bc58eae29370fac3f174b160e2bcf3d4deed155a2c7de5b81bb8e3dd8f" gracePeriod=30 Feb 18 06:05:31 crc kubenswrapper[4869]: I0218 06:05:31.615421 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="49b497a5-fbf7-42a0-87c5-0edd1b788ee1" containerName="glance-httpd" containerID="cri-o://e9aa428c4be69c0e0b15838600b79a6b13e4ab9a1ec45b395bbbfc917ff1d817" gracePeriod=30 Feb 18 06:05:31 crc kubenswrapper[4869]: I0218 06:05:31.615313 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"49b497a5-fbf7-42a0-87c5-0edd1b788ee1","Type":"ContainerStarted","Data":"e9aa428c4be69c0e0b15838600b79a6b13e4ab9a1ec45b395bbbfc917ff1d817"} Feb 18 06:05:31 crc kubenswrapper[4869]: I0218 06:05:31.649554 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.649534095 podStartE2EDuration="5.649534095s" podCreationTimestamp="2026-02-18 06:05:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:31.62712106 +0000 UTC m=+1028.796209302" watchObservedRunningTime="2026-02-18 06:05:31.649534095 +0000 UTC m=+1028.818622327" Feb 18 06:05:31 crc kubenswrapper[4869]: I0218 06:05:31.684049 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.684021092 podStartE2EDuration="6.684021092s" podCreationTimestamp="2026-02-18 06:05:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:31.656163315 +0000 UTC m=+1028.825251547" watchObservedRunningTime="2026-02-18 06:05:31.684021092 +0000 UTC m=+1028.853109324" Feb 18 06:05:32 crc kubenswrapper[4869]: I0218 06:05:32.629383 4869 generic.go:334] "Generic (PLEG): container finished" podID="5bb7043f-ce57-401f-844e-b96417b8e219" containerID="cc0a7c6e939da768bd5956a41cdac243c14d475ab28870517e968bd5b869b21f" exitCode=0 Feb 18 06:05:32 crc kubenswrapper[4869]: I0218 06:05:32.629466 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x946f" event={"ID":"5bb7043f-ce57-401f-844e-b96417b8e219","Type":"ContainerDied","Data":"cc0a7c6e939da768bd5956a41cdac243c14d475ab28870517e968bd5b869b21f"} Feb 18 06:05:32 crc kubenswrapper[4869]: I0218 06:05:32.641760 4869 generic.go:334] "Generic (PLEG): container finished" podID="d0de7d58-05cc-44bb-93f7-d74404c4ea56" containerID="295374791c43887f553c0032b6a4f6bb2a9d05879bb74f5633032af93b66511f" exitCode=143 Feb 18 06:05:32 crc kubenswrapper[4869]: I0218 06:05:32.641798 4869 generic.go:334] "Generic (PLEG): container finished" podID="d0de7d58-05cc-44bb-93f7-d74404c4ea56" containerID="7816ce0a15d7c39b2a7aab31b12c49bf3c956380c139ce57b0f6abe2c0fffc74" exitCode=143 Feb 18 06:05:32 crc kubenswrapper[4869]: I0218 06:05:32.641797 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"d0de7d58-05cc-44bb-93f7-d74404c4ea56","Type":"ContainerDied","Data":"295374791c43887f553c0032b6a4f6bb2a9d05879bb74f5633032af93b66511f"} Feb 18 06:05:32 crc kubenswrapper[4869]: I0218 06:05:32.641863 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d0de7d58-05cc-44bb-93f7-d74404c4ea56","Type":"ContainerDied","Data":"7816ce0a15d7c39b2a7aab31b12c49bf3c956380c139ce57b0f6abe2c0fffc74"} Feb 18 06:05:32 crc kubenswrapper[4869]: I0218 06:05:32.648642 4869 generic.go:334] "Generic (PLEG): container finished" podID="49b497a5-fbf7-42a0-87c5-0edd1b788ee1" containerID="e9aa428c4be69c0e0b15838600b79a6b13e4ab9a1ec45b395bbbfc917ff1d817" exitCode=143 Feb 18 06:05:32 crc kubenswrapper[4869]: I0218 06:05:32.648670 4869 generic.go:334] "Generic (PLEG): container finished" podID="49b497a5-fbf7-42a0-87c5-0edd1b788ee1" containerID="5307b8bc58eae29370fac3f174b160e2bcf3d4deed155a2c7de5b81bb8e3dd8f" exitCode=143 Feb 18 06:05:32 crc kubenswrapper[4869]: I0218 06:05:32.648695 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"49b497a5-fbf7-42a0-87c5-0edd1b788ee1","Type":"ContainerDied","Data":"e9aa428c4be69c0e0b15838600b79a6b13e4ab9a1ec45b395bbbfc917ff1d817"} Feb 18 06:05:32 crc kubenswrapper[4869]: I0218 06:05:32.648723 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"49b497a5-fbf7-42a0-87c5-0edd1b788ee1","Type":"ContainerDied","Data":"5307b8bc58eae29370fac3f174b160e2bcf3d4deed155a2c7de5b81bb8e3dd8f"} Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.405387 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76444b47f5-mc7kc"] Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.437401 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-69d999cf4d-drf2r"] Feb 18 06:05:34 crc 
kubenswrapper[4869]: I0218 06:05:34.438807 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.442071 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.507429 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rslv8\" (UniqueName: \"kubernetes.io/projected/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-kube-api-access-rslv8\") pod \"horizon-69d999cf4d-drf2r\" (UID: \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") " pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.507522 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-horizon-tls-certs\") pod \"horizon-69d999cf4d-drf2r\" (UID: \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") " pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.507565 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-horizon-secret-key\") pod \"horizon-69d999cf4d-drf2r\" (UID: \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") " pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.507581 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-scripts\") pod \"horizon-69d999cf4d-drf2r\" (UID: \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") " pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.507604 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-config-data\") pod \"horizon-69d999cf4d-drf2r\" (UID: \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") " pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.507618 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-combined-ca-bundle\") pod \"horizon-69d999cf4d-drf2r\" (UID: \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") " pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.510636 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-logs\") pod \"horizon-69d999cf4d-drf2r\" (UID: \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") " pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.533951 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-69d999cf4d-drf2r"] Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.559953 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-588975f6bf-9447r"] Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.571339 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-57f5fddd88-qhh5n"] Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.572943 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-57f5fddd88-qhh5n" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.589538 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-57f5fddd88-qhh5n"] Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.611389 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rslv8\" (UniqueName: \"kubernetes.io/projected/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-kube-api-access-rslv8\") pod \"horizon-69d999cf4d-drf2r\" (UID: \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") " pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.612024 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/391d8fe4-58ea-434e-918f-811b7c3e14b2-combined-ca-bundle\") pod \"horizon-57f5fddd88-qhh5n\" (UID: \"391d8fe4-58ea-434e-918f-811b7c3e14b2\") " pod="openstack/horizon-57f5fddd88-qhh5n" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.612176 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-horizon-tls-certs\") pod \"horizon-69d999cf4d-drf2r\" (UID: \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") " pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.612218 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/391d8fe4-58ea-434e-918f-811b7c3e14b2-logs\") pod \"horizon-57f5fddd88-qhh5n\" (UID: \"391d8fe4-58ea-434e-918f-811b7c3e14b2\") " pod="openstack/horizon-57f5fddd88-qhh5n" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.612310 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/391d8fe4-58ea-434e-918f-811b7c3e14b2-horizon-secret-key\") pod \"horizon-57f5fddd88-qhh5n\" (UID: \"391d8fe4-58ea-434e-918f-811b7c3e14b2\") " pod="openstack/horizon-57f5fddd88-qhh5n" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.612421 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-horizon-secret-key\") pod \"horizon-69d999cf4d-drf2r\" (UID: \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") " pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.612454 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-scripts\") pod \"horizon-69d999cf4d-drf2r\" (UID: \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") " pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.612474 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/391d8fe4-58ea-434e-918f-811b7c3e14b2-scripts\") pod \"horizon-57f5fddd88-qhh5n\" (UID: \"391d8fe4-58ea-434e-918f-811b7c3e14b2\") " pod="openstack/horizon-57f5fddd88-qhh5n" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.612502 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-combined-ca-bundle\") pod \"horizon-69d999cf4d-drf2r\" (UID: \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") " pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.612523 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-config-data\") pod 
\"horizon-69d999cf4d-drf2r\" (UID: \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") " pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.612560 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k7kw\" (UniqueName: \"kubernetes.io/projected/391d8fe4-58ea-434e-918f-811b7c3e14b2-kube-api-access-2k7kw\") pod \"horizon-57f5fddd88-qhh5n\" (UID: \"391d8fe4-58ea-434e-918f-811b7c3e14b2\") " pod="openstack/horizon-57f5fddd88-qhh5n" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.612597 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/391d8fe4-58ea-434e-918f-811b7c3e14b2-horizon-tls-certs\") pod \"horizon-57f5fddd88-qhh5n\" (UID: \"391d8fe4-58ea-434e-918f-811b7c3e14b2\") " pod="openstack/horizon-57f5fddd88-qhh5n" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.612621 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-logs\") pod \"horizon-69d999cf4d-drf2r\" (UID: \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") " pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.612651 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/391d8fe4-58ea-434e-918f-811b7c3e14b2-config-data\") pod \"horizon-57f5fddd88-qhh5n\" (UID: \"391d8fe4-58ea-434e-918f-811b7c3e14b2\") " pod="openstack/horizon-57f5fddd88-qhh5n" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.613661 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-scripts\") pod \"horizon-69d999cf4d-drf2r\" (UID: 
\"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") " pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.613936 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-logs\") pod \"horizon-69d999cf4d-drf2r\" (UID: \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") " pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.617595 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-config-data\") pod \"horizon-69d999cf4d-drf2r\" (UID: \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") " pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.619883 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-combined-ca-bundle\") pod \"horizon-69d999cf4d-drf2r\" (UID: \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") " pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.629397 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rslv8\" (UniqueName: \"kubernetes.io/projected/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-kube-api-access-rslv8\") pod \"horizon-69d999cf4d-drf2r\" (UID: \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") " pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.635607 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-horizon-secret-key\") pod \"horizon-69d999cf4d-drf2r\" (UID: \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") " pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 
06:05:34.636909 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-horizon-tls-certs\") pod \"horizon-69d999cf4d-drf2r\" (UID: \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") " pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.714606 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/391d8fe4-58ea-434e-918f-811b7c3e14b2-horizon-tls-certs\") pod \"horizon-57f5fddd88-qhh5n\" (UID: \"391d8fe4-58ea-434e-918f-811b7c3e14b2\") " pod="openstack/horizon-57f5fddd88-qhh5n" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.714685 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/391d8fe4-58ea-434e-918f-811b7c3e14b2-config-data\") pod \"horizon-57f5fddd88-qhh5n\" (UID: \"391d8fe4-58ea-434e-918f-811b7c3e14b2\") " pod="openstack/horizon-57f5fddd88-qhh5n" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.714827 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/391d8fe4-58ea-434e-918f-811b7c3e14b2-combined-ca-bundle\") pod \"horizon-57f5fddd88-qhh5n\" (UID: \"391d8fe4-58ea-434e-918f-811b7c3e14b2\") " pod="openstack/horizon-57f5fddd88-qhh5n" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.714865 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/391d8fe4-58ea-434e-918f-811b7c3e14b2-logs\") pod \"horizon-57f5fddd88-qhh5n\" (UID: \"391d8fe4-58ea-434e-918f-811b7c3e14b2\") " pod="openstack/horizon-57f5fddd88-qhh5n" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.714893 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/391d8fe4-58ea-434e-918f-811b7c3e14b2-horizon-secret-key\") pod \"horizon-57f5fddd88-qhh5n\" (UID: \"391d8fe4-58ea-434e-918f-811b7c3e14b2\") " pod="openstack/horizon-57f5fddd88-qhh5n" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.714921 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/391d8fe4-58ea-434e-918f-811b7c3e14b2-scripts\") pod \"horizon-57f5fddd88-qhh5n\" (UID: \"391d8fe4-58ea-434e-918f-811b7c3e14b2\") " pod="openstack/horizon-57f5fddd88-qhh5n" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.714962 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k7kw\" (UniqueName: \"kubernetes.io/projected/391d8fe4-58ea-434e-918f-811b7c3e14b2-kube-api-access-2k7kw\") pod \"horizon-57f5fddd88-qhh5n\" (UID: \"391d8fe4-58ea-434e-918f-811b7c3e14b2\") " pod="openstack/horizon-57f5fddd88-qhh5n" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.715694 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/391d8fe4-58ea-434e-918f-811b7c3e14b2-logs\") pod \"horizon-57f5fddd88-qhh5n\" (UID: \"391d8fe4-58ea-434e-918f-811b7c3e14b2\") " pod="openstack/horizon-57f5fddd88-qhh5n" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.716721 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/391d8fe4-58ea-434e-918f-811b7c3e14b2-scripts\") pod \"horizon-57f5fddd88-qhh5n\" (UID: \"391d8fe4-58ea-434e-918f-811b7c3e14b2\") " pod="openstack/horizon-57f5fddd88-qhh5n" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.718175 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/391d8fe4-58ea-434e-918f-811b7c3e14b2-config-data\") pod \"horizon-57f5fddd88-qhh5n\" (UID: 
\"391d8fe4-58ea-434e-918f-811b7c3e14b2\") " pod="openstack/horizon-57f5fddd88-qhh5n" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.719690 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/391d8fe4-58ea-434e-918f-811b7c3e14b2-horizon-tls-certs\") pod \"horizon-57f5fddd88-qhh5n\" (UID: \"391d8fe4-58ea-434e-918f-811b7c3e14b2\") " pod="openstack/horizon-57f5fddd88-qhh5n" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.719999 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/391d8fe4-58ea-434e-918f-811b7c3e14b2-combined-ca-bundle\") pod \"horizon-57f5fddd88-qhh5n\" (UID: \"391d8fe4-58ea-434e-918f-811b7c3e14b2\") " pod="openstack/horizon-57f5fddd88-qhh5n" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.720883 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/391d8fe4-58ea-434e-918f-811b7c3e14b2-horizon-secret-key\") pod \"horizon-57f5fddd88-qhh5n\" (UID: \"391d8fe4-58ea-434e-918f-811b7c3e14b2\") " pod="openstack/horizon-57f5fddd88-qhh5n" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.733195 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k7kw\" (UniqueName: \"kubernetes.io/projected/391d8fe4-58ea-434e-918f-811b7c3e14b2-kube-api-access-2k7kw\") pod \"horizon-57f5fddd88-qhh5n\" (UID: \"391d8fe4-58ea-434e-918f-811b7c3e14b2\") " pod="openstack/horizon-57f5fddd88-qhh5n" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.763536 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:05:34 crc kubenswrapper[4869]: I0218 06:05:34.894021 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-57f5fddd88-qhh5n" Feb 18 06:05:36 crc kubenswrapper[4869]: I0218 06:05:36.492952 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" Feb 18 06:05:36 crc kubenswrapper[4869]: I0218 06:05:36.556735 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g6q8b"] Feb 18 06:05:36 crc kubenswrapper[4869]: I0218 06:05:36.557007 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" podUID="73514d80-ca11-468b-a6a0-4c3fffff8fea" containerName="dnsmasq-dns" containerID="cri-o://bdd6ea05c4e5389753594030a5481a76090d9fb7300755ec472f58e94a940d5a" gracePeriod=10 Feb 18 06:05:36 crc kubenswrapper[4869]: I0218 06:05:36.807813 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" podUID="73514d80-ca11-468b-a6a0-4c3fffff8fea" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: connect: connection refused" Feb 18 06:05:37 crc kubenswrapper[4869]: I0218 06:05:37.704843 4869 generic.go:334] "Generic (PLEG): container finished" podID="73514d80-ca11-468b-a6a0-4c3fffff8fea" containerID="bdd6ea05c4e5389753594030a5481a76090d9fb7300755ec472f58e94a940d5a" exitCode=0 Feb 18 06:05:37 crc kubenswrapper[4869]: I0218 06:05:37.704898 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" event={"ID":"73514d80-ca11-468b-a6a0-4c3fffff8fea","Type":"ContainerDied","Data":"bdd6ea05c4e5389753594030a5481a76090d9fb7300755ec472f58e94a940d5a"} Feb 18 06:05:41 crc kubenswrapper[4869]: I0218 06:05:41.713602 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-x946f" Feb 18 06:05:41 crc kubenswrapper[4869]: I0218 06:05:41.758359 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x946f" event={"ID":"5bb7043f-ce57-401f-844e-b96417b8e219","Type":"ContainerDied","Data":"70e06859a3f8c9663edad92b1ee44c375049f77195dd29959674b28f3705cce9"} Feb 18 06:05:41 crc kubenswrapper[4869]: I0218 06:05:41.758398 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70e06859a3f8c9663edad92b1ee44c375049f77195dd29959674b28f3705cce9" Feb 18 06:05:41 crc kubenswrapper[4869]: I0218 06:05:41.758433 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x946f" Feb 18 06:05:41 crc kubenswrapper[4869]: I0218 06:05:41.859241 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljtxm\" (UniqueName: \"kubernetes.io/projected/5bb7043f-ce57-401f-844e-b96417b8e219-kube-api-access-ljtxm\") pod \"5bb7043f-ce57-401f-844e-b96417b8e219\" (UID: \"5bb7043f-ce57-401f-844e-b96417b8e219\") " Feb 18 06:05:41 crc kubenswrapper[4869]: I0218 06:05:41.859418 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-scripts\") pod \"5bb7043f-ce57-401f-844e-b96417b8e219\" (UID: \"5bb7043f-ce57-401f-844e-b96417b8e219\") " Feb 18 06:05:41 crc kubenswrapper[4869]: I0218 06:05:41.859481 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-credential-keys\") pod \"5bb7043f-ce57-401f-844e-b96417b8e219\" (UID: \"5bb7043f-ce57-401f-844e-b96417b8e219\") " Feb 18 06:05:41 crc kubenswrapper[4869]: I0218 06:05:41.860457 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-fernet-keys\") pod \"5bb7043f-ce57-401f-844e-b96417b8e219\" (UID: \"5bb7043f-ce57-401f-844e-b96417b8e219\") " Feb 18 06:05:41 crc kubenswrapper[4869]: I0218 06:05:41.860494 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-config-data\") pod \"5bb7043f-ce57-401f-844e-b96417b8e219\" (UID: \"5bb7043f-ce57-401f-844e-b96417b8e219\") " Feb 18 06:05:41 crc kubenswrapper[4869]: I0218 06:05:41.860563 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-combined-ca-bundle\") pod \"5bb7043f-ce57-401f-844e-b96417b8e219\" (UID: \"5bb7043f-ce57-401f-844e-b96417b8e219\") " Feb 18 06:05:41 crc kubenswrapper[4869]: I0218 06:05:41.866053 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5bb7043f-ce57-401f-844e-b96417b8e219" (UID: "5bb7043f-ce57-401f-844e-b96417b8e219"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:05:41 crc kubenswrapper[4869]: I0218 06:05:41.866231 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bb7043f-ce57-401f-844e-b96417b8e219-kube-api-access-ljtxm" (OuterVolumeSpecName: "kube-api-access-ljtxm") pod "5bb7043f-ce57-401f-844e-b96417b8e219" (UID: "5bb7043f-ce57-401f-844e-b96417b8e219"). InnerVolumeSpecName "kube-api-access-ljtxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:41 crc kubenswrapper[4869]: I0218 06:05:41.868360 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5bb7043f-ce57-401f-844e-b96417b8e219" (UID: "5bb7043f-ce57-401f-844e-b96417b8e219"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:05:41 crc kubenswrapper[4869]: I0218 06:05:41.878144 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-scripts" (OuterVolumeSpecName: "scripts") pod "5bb7043f-ce57-401f-844e-b96417b8e219" (UID: "5bb7043f-ce57-401f-844e-b96417b8e219"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:05:41 crc kubenswrapper[4869]: I0218 06:05:41.905868 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5bb7043f-ce57-401f-844e-b96417b8e219" (UID: "5bb7043f-ce57-401f-844e-b96417b8e219"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:05:41 crc kubenswrapper[4869]: I0218 06:05:41.912035 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-config-data" (OuterVolumeSpecName: "config-data") pod "5bb7043f-ce57-401f-844e-b96417b8e219" (UID: "5bb7043f-ce57-401f-844e-b96417b8e219"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:05:41 crc kubenswrapper[4869]: I0218 06:05:41.962643 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljtxm\" (UniqueName: \"kubernetes.io/projected/5bb7043f-ce57-401f-844e-b96417b8e219-kube-api-access-ljtxm\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:41 crc kubenswrapper[4869]: I0218 06:05:41.962958 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:41 crc kubenswrapper[4869]: I0218 06:05:41.962970 4869 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:41 crc kubenswrapper[4869]: I0218 06:05:41.962980 4869 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:41 crc kubenswrapper[4869]: I0218 06:05:41.962988 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:41 crc kubenswrapper[4869]: I0218 06:05:41.963016 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bb7043f-ce57-401f-844e-b96417b8e219-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:42 crc kubenswrapper[4869]: I0218 06:05:42.896712 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-x946f"] Feb 18 06:05:42 crc kubenswrapper[4869]: I0218 06:05:42.905381 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-x946f"] Feb 18 06:05:43 crc 
kubenswrapper[4869]: I0218 06:05:43.052641 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-8wq6q"] Feb 18 06:05:43 crc kubenswrapper[4869]: E0218 06:05:43.053327 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bb7043f-ce57-401f-844e-b96417b8e219" containerName="keystone-bootstrap" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.053349 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bb7043f-ce57-401f-844e-b96417b8e219" containerName="keystone-bootstrap" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.053589 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bb7043f-ce57-401f-844e-b96417b8e219" containerName="keystone-bootstrap" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.054698 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8wq6q" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.056450 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.056475 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.056458 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-h7tvj" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.058335 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.058900 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.061652 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8wq6q"] Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.197212 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-credential-keys\") pod \"keystone-bootstrap-8wq6q\" (UID: \"c953d279-bde4-4a98-87d5-3cbcaefc875c\") " pod="openstack/keystone-bootstrap-8wq6q" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.197285 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-config-data\") pod \"keystone-bootstrap-8wq6q\" (UID: \"c953d279-bde4-4a98-87d5-3cbcaefc875c\") " pod="openstack/keystone-bootstrap-8wq6q" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.197310 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-fernet-keys\") pod \"keystone-bootstrap-8wq6q\" (UID: \"c953d279-bde4-4a98-87d5-3cbcaefc875c\") " pod="openstack/keystone-bootstrap-8wq6q" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.197367 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnkpb\" (UniqueName: \"kubernetes.io/projected/c953d279-bde4-4a98-87d5-3cbcaefc875c-kube-api-access-fnkpb\") pod \"keystone-bootstrap-8wq6q\" (UID: \"c953d279-bde4-4a98-87d5-3cbcaefc875c\") " pod="openstack/keystone-bootstrap-8wq6q" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.197416 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-scripts\") pod \"keystone-bootstrap-8wq6q\" (UID: \"c953d279-bde4-4a98-87d5-3cbcaefc875c\") " pod="openstack/keystone-bootstrap-8wq6q" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.197444 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-combined-ca-bundle\") pod \"keystone-bootstrap-8wq6q\" (UID: \"c953d279-bde4-4a98-87d5-3cbcaefc875c\") " pod="openstack/keystone-bootstrap-8wq6q" Feb 18 06:05:43 crc kubenswrapper[4869]: E0218 06:05:43.201548 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Feb 18 06:05:43 crc kubenswrapper[4869]: E0218 06:05:43.201854 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,
},VolumeMount{Name:kube-api-access-swrdw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-gk2zt_openstack(50b8c093-2eb4-4220-b335-b5b94fb8776e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:05:43 crc kubenswrapper[4869]: E0218 06:05:43.203219 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-gk2zt" podUID="50b8c093-2eb4-4220-b335-b5b94fb8776e" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.299164 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-credential-keys\") pod \"keystone-bootstrap-8wq6q\" (UID: \"c953d279-bde4-4a98-87d5-3cbcaefc875c\") " pod="openstack/keystone-bootstrap-8wq6q" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.299224 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-config-data\") pod 
\"keystone-bootstrap-8wq6q\" (UID: \"c953d279-bde4-4a98-87d5-3cbcaefc875c\") " pod="openstack/keystone-bootstrap-8wq6q" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.299250 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-fernet-keys\") pod \"keystone-bootstrap-8wq6q\" (UID: \"c953d279-bde4-4a98-87d5-3cbcaefc875c\") " pod="openstack/keystone-bootstrap-8wq6q" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.299329 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnkpb\" (UniqueName: \"kubernetes.io/projected/c953d279-bde4-4a98-87d5-3cbcaefc875c-kube-api-access-fnkpb\") pod \"keystone-bootstrap-8wq6q\" (UID: \"c953d279-bde4-4a98-87d5-3cbcaefc875c\") " pod="openstack/keystone-bootstrap-8wq6q" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.299398 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-scripts\") pod \"keystone-bootstrap-8wq6q\" (UID: \"c953d279-bde4-4a98-87d5-3cbcaefc875c\") " pod="openstack/keystone-bootstrap-8wq6q" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.299440 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-combined-ca-bundle\") pod \"keystone-bootstrap-8wq6q\" (UID: \"c953d279-bde4-4a98-87d5-3cbcaefc875c\") " pod="openstack/keystone-bootstrap-8wq6q" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.306948 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-combined-ca-bundle\") pod \"keystone-bootstrap-8wq6q\" (UID: \"c953d279-bde4-4a98-87d5-3cbcaefc875c\") " 
pod="openstack/keystone-bootstrap-8wq6q" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.306949 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-config-data\") pod \"keystone-bootstrap-8wq6q\" (UID: \"c953d279-bde4-4a98-87d5-3cbcaefc875c\") " pod="openstack/keystone-bootstrap-8wq6q" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.307369 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-scripts\") pod \"keystone-bootstrap-8wq6q\" (UID: \"c953d279-bde4-4a98-87d5-3cbcaefc875c\") " pod="openstack/keystone-bootstrap-8wq6q" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.309639 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-fernet-keys\") pod \"keystone-bootstrap-8wq6q\" (UID: \"c953d279-bde4-4a98-87d5-3cbcaefc875c\") " pod="openstack/keystone-bootstrap-8wq6q" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.312149 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-credential-keys\") pod \"keystone-bootstrap-8wq6q\" (UID: \"c953d279-bde4-4a98-87d5-3cbcaefc875c\") " pod="openstack/keystone-bootstrap-8wq6q" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.327309 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnkpb\" (UniqueName: \"kubernetes.io/projected/c953d279-bde4-4a98-87d5-3cbcaefc875c-kube-api-access-fnkpb\") pod \"keystone-bootstrap-8wq6q\" (UID: \"c953d279-bde4-4a98-87d5-3cbcaefc875c\") " pod="openstack/keystone-bootstrap-8wq6q" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.384627 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8wq6q" Feb 18 06:05:43 crc kubenswrapper[4869]: I0218 06:05:43.480910 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bb7043f-ce57-401f-844e-b96417b8e219" path="/var/lib/kubelet/pods/5bb7043f-ce57-401f-844e-b96417b8e219/volumes" Feb 18 06:05:43 crc kubenswrapper[4869]: E0218 06:05:43.775185 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-gk2zt" podUID="50b8c093-2eb4-4220-b335-b5b94fb8776e" Feb 18 06:05:45 crc kubenswrapper[4869]: E0218 06:05:45.194379 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 18 06:05:45 crc kubenswrapper[4869]: E0218 06:05:45.195245 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n56bh65bhdch688h699h65fh5dbh5ddh78h5b4h5b5h56fhcch685h5dh5fch7dh57bh7dh694h56fhc9h5b9hcchffhf7h699h98h74h77h644h5b6q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sdhsb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-745b5b966f-7mdzf_openstack(8127655f-342a-4bfc-a5c3-0a44bfc3cb77): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:05:45 crc kubenswrapper[4869]: E0218 
06:05:45.197725 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-745b5b966f-7mdzf" podUID="8127655f-342a-4bfc-a5c3-0a44bfc3cb77" Feb 18 06:05:46 crc kubenswrapper[4869]: I0218 06:05:46.807116 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" podUID="73514d80-ca11-468b-a6a0-4c3fffff8fea" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: i/o timeout" Feb 18 06:05:47 crc kubenswrapper[4869]: I0218 06:05:47.818986 4869 generic.go:334] "Generic (PLEG): container finished" podID="c414ef60-f94a-4047-ad26-b7ca6fa3f93b" containerID="daff6c6b03f528b83967bcbb8355a5fc18ffb7d072cccff4c7d7d4e3ba44ca7d" exitCode=0 Feb 18 06:05:47 crc kubenswrapper[4869]: I0218 06:05:47.819074 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7cn7l" event={"ID":"c414ef60-f94a-4047-ad26-b7ca6fa3f93b","Type":"ContainerDied","Data":"daff6c6b03f528b83967bcbb8355a5fc18ffb7d072cccff4c7d7d4e3ba44ca7d"} Feb 18 06:05:51 crc kubenswrapper[4869]: I0218 06:05:51.808069 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" podUID="73514d80-ca11-468b-a6a0-4c3fffff8fea" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: i/o timeout" Feb 18 06:05:51 crc kubenswrapper[4869]: I0218 06:05:51.808984 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.679500 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.686875 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.805595 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0de7d58-05cc-44bb-93f7-d74404c4ea56-logs\") pod \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.805639 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0de7d58-05cc-44bb-93f7-d74404c4ea56-scripts\") pod \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.805681 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4fvm\" (UniqueName: \"kubernetes.io/projected/d0de7d58-05cc-44bb-93f7-d74404c4ea56-kube-api-access-l4fvm\") pod \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.805784 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0de7d58-05cc-44bb-93f7-d74404c4ea56-config-data\") pod \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.805809 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " Feb 18 06:05:52 crc 
kubenswrapper[4869]: I0218 06:05:52.805844 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-config\") pod \"73514d80-ca11-468b-a6a0-4c3fffff8fea\" (UID: \"73514d80-ca11-468b-a6a0-4c3fffff8fea\") " Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.805868 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-dns-svc\") pod \"73514d80-ca11-468b-a6a0-4c3fffff8fea\" (UID: \"73514d80-ca11-468b-a6a0-4c3fffff8fea\") " Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.805885 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-ovsdbserver-nb\") pod \"73514d80-ca11-468b-a6a0-4c3fffff8fea\" (UID: \"73514d80-ca11-468b-a6a0-4c3fffff8fea\") " Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.805903 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0de7d58-05cc-44bb-93f7-d74404c4ea56-internal-tls-certs\") pod \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.805946 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0de7d58-05cc-44bb-93f7-d74404c4ea56-combined-ca-bundle\") pod \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.805969 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-dns-swift-storage-0\") 
pod \"73514d80-ca11-468b-a6a0-4c3fffff8fea\" (UID: \"73514d80-ca11-468b-a6a0-4c3fffff8fea\") " Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.806001 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgl24\" (UniqueName: \"kubernetes.io/projected/73514d80-ca11-468b-a6a0-4c3fffff8fea-kube-api-access-zgl24\") pod \"73514d80-ca11-468b-a6a0-4c3fffff8fea\" (UID: \"73514d80-ca11-468b-a6a0-4c3fffff8fea\") " Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.806041 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d0de7d58-05cc-44bb-93f7-d74404c4ea56-httpd-run\") pod \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\" (UID: \"d0de7d58-05cc-44bb-93f7-d74404c4ea56\") " Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.806068 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-ovsdbserver-sb\") pod \"73514d80-ca11-468b-a6a0-4c3fffff8fea\" (UID: \"73514d80-ca11-468b-a6a0-4c3fffff8fea\") " Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.806763 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0de7d58-05cc-44bb-93f7-d74404c4ea56-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d0de7d58-05cc-44bb-93f7-d74404c4ea56" (UID: "d0de7d58-05cc-44bb-93f7-d74404c4ea56"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.806858 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0de7d58-05cc-44bb-93f7-d74404c4ea56-logs" (OuterVolumeSpecName: "logs") pod "d0de7d58-05cc-44bb-93f7-d74404c4ea56" (UID: "d0de7d58-05cc-44bb-93f7-d74404c4ea56"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.812371 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73514d80-ca11-468b-a6a0-4c3fffff8fea-kube-api-access-zgl24" (OuterVolumeSpecName: "kube-api-access-zgl24") pod "73514d80-ca11-468b-a6a0-4c3fffff8fea" (UID: "73514d80-ca11-468b-a6a0-4c3fffff8fea"). InnerVolumeSpecName "kube-api-access-zgl24". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.812375 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "d0de7d58-05cc-44bb-93f7-d74404c4ea56" (UID: "d0de7d58-05cc-44bb-93f7-d74404c4ea56"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.812825 4869 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.812841 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgl24\" (UniqueName: \"kubernetes.io/projected/73514d80-ca11-468b-a6a0-4c3fffff8fea-kube-api-access-zgl24\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.812851 4869 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d0de7d58-05cc-44bb-93f7-d74404c4ea56-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.812860 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0de7d58-05cc-44bb-93f7-d74404c4ea56-logs\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:52 crc 
kubenswrapper[4869]: I0218 06:05:52.812938 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0de7d58-05cc-44bb-93f7-d74404c4ea56-kube-api-access-l4fvm" (OuterVolumeSpecName: "kube-api-access-l4fvm") pod "d0de7d58-05cc-44bb-93f7-d74404c4ea56" (UID: "d0de7d58-05cc-44bb-93f7-d74404c4ea56"). InnerVolumeSpecName "kube-api-access-l4fvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.816060 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0de7d58-05cc-44bb-93f7-d74404c4ea56-scripts" (OuterVolumeSpecName: "scripts") pod "d0de7d58-05cc-44bb-93f7-d74404c4ea56" (UID: "d0de7d58-05cc-44bb-93f7-d74404c4ea56"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.833733 4869 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.839261 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0de7d58-05cc-44bb-93f7-d74404c4ea56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0de7d58-05cc-44bb-93f7-d74404c4ea56" (UID: "d0de7d58-05cc-44bb-93f7-d74404c4ea56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.855304 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "73514d80-ca11-468b-a6a0-4c3fffff8fea" (UID: "73514d80-ca11-468b-a6a0-4c3fffff8fea"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.857814 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "73514d80-ca11-468b-a6a0-4c3fffff8fea" (UID: "73514d80-ca11-468b-a6a0-4c3fffff8fea"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.858844 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0de7d58-05cc-44bb-93f7-d74404c4ea56-config-data" (OuterVolumeSpecName: "config-data") pod "d0de7d58-05cc-44bb-93f7-d74404c4ea56" (UID: "d0de7d58-05cc-44bb-93f7-d74404c4ea56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.867636 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" event={"ID":"73514d80-ca11-468b-a6a0-4c3fffff8fea","Type":"ContainerDied","Data":"5963833b063036660d5c36746557c78563fbf7abe2104a6b0b4be750f89976f9"} Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.867701 4869 scope.go:117] "RemoveContainer" containerID="bdd6ea05c4e5389753594030a5481a76090d9fb7300755ec472f58e94a940d5a" Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.867866 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b"
Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.871054 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d0de7d58-05cc-44bb-93f7-d74404c4ea56","Type":"ContainerDied","Data":"f01002c44f133e802445e87c5691a78a90c57ba9cee6fe8242dee13bbf86092f"}
Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.871175 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.875384 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-config" (OuterVolumeSpecName: "config") pod "73514d80-ca11-468b-a6a0-4c3fffff8fea" (UID: "73514d80-ca11-468b-a6a0-4c3fffff8fea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.876142 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0de7d58-05cc-44bb-93f7-d74404c4ea56-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d0de7d58-05cc-44bb-93f7-d74404c4ea56" (UID: "d0de7d58-05cc-44bb-93f7-d74404c4ea56"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.876470 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "73514d80-ca11-468b-a6a0-4c3fffff8fea" (UID: "73514d80-ca11-468b-a6a0-4c3fffff8fea"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.878124 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "73514d80-ca11-468b-a6a0-4c3fffff8fea" (UID: "73514d80-ca11-468b-a6a0-4c3fffff8fea"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.914420 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.914462 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.914479 4869 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0de7d58-05cc-44bb-93f7-d74404c4ea56-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.914494 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0de7d58-05cc-44bb-93f7-d74404c4ea56-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.914508 4869 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.914523 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.914533 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0de7d58-05cc-44bb-93f7-d74404c4ea56-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.914544 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4fvm\" (UniqueName: \"kubernetes.io/projected/d0de7d58-05cc-44bb-93f7-d74404c4ea56-kube-api-access-l4fvm\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.914554 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0de7d58-05cc-44bb-93f7-d74404c4ea56-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.914566 4869 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:52 crc kubenswrapper[4869]: I0218 06:05:52.914577 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73514d80-ca11-468b-a6a0-4c3fffff8fea-config\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:53 crc kubenswrapper[4869]: E0218 06:05:53.169843 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified"
Feb 18 06:05:53 crc kubenswrapper[4869]: E0218 06:05:53.170318 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8hktd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-zzcxf_openstack(ffdffd9a-f626-4bf2-b1e0-104eca55e7f5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 18 06:05:53 crc kubenswrapper[4869]: E0218 06:05:53.171414 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-zzcxf" podUID="ffdffd9a-f626-4bf2-b1e0-104eca55e7f5"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.172899 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.180159 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-745b5b966f-7mdzf"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.190449 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7cn7l"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.258204 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g6q8b"]
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.266254 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g6q8b"]
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.287611 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.298477 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.306475 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 18 06:05:53 crc kubenswrapper[4869]: E0218 06:05:53.306892 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0de7d58-05cc-44bb-93f7-d74404c4ea56" containerName="glance-log"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.306912 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0de7d58-05cc-44bb-93f7-d74404c4ea56" containerName="glance-log"
Feb 18 06:05:53 crc kubenswrapper[4869]: E0218 06:05:53.306926 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73514d80-ca11-468b-a6a0-4c3fffff8fea" containerName="dnsmasq-dns"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.306932 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="73514d80-ca11-468b-a6a0-4c3fffff8fea" containerName="dnsmasq-dns"
Feb 18 06:05:53 crc kubenswrapper[4869]: E0218 06:05:53.306943 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49b497a5-fbf7-42a0-87c5-0edd1b788ee1" containerName="glance-httpd"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.306950 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b497a5-fbf7-42a0-87c5-0edd1b788ee1" containerName="glance-httpd"
Feb 18 06:05:53 crc kubenswrapper[4869]: E0218 06:05:53.306965 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73514d80-ca11-468b-a6a0-4c3fffff8fea" containerName="init"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.306971 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="73514d80-ca11-468b-a6a0-4c3fffff8fea" containerName="init"
Feb 18 06:05:53 crc kubenswrapper[4869]: E0218 06:05:53.306984 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0de7d58-05cc-44bb-93f7-d74404c4ea56" containerName="glance-httpd"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.306989 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0de7d58-05cc-44bb-93f7-d74404c4ea56" containerName="glance-httpd"
Feb 18 06:05:53 crc kubenswrapper[4869]: E0218 06:05:53.306998 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49b497a5-fbf7-42a0-87c5-0edd1b788ee1" containerName="glance-log"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.307004 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b497a5-fbf7-42a0-87c5-0edd1b788ee1" containerName="glance-log"
Feb 18 06:05:53 crc kubenswrapper[4869]: E0218 06:05:53.307024 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c414ef60-f94a-4047-ad26-b7ca6fa3f93b" containerName="neutron-db-sync"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.307030 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="c414ef60-f94a-4047-ad26-b7ca6fa3f93b" containerName="neutron-db-sync"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.307189 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="c414ef60-f94a-4047-ad26-b7ca6fa3f93b" containerName="neutron-db-sync"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.307203 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="73514d80-ca11-468b-a6a0-4c3fffff8fea" containerName="dnsmasq-dns"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.307209 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="49b497a5-fbf7-42a0-87c5-0edd1b788ee1" containerName="glance-httpd"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.307218 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0de7d58-05cc-44bb-93f7-d74404c4ea56" containerName="glance-log"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.307233 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="49b497a5-fbf7-42a0-87c5-0edd1b788ee1" containerName="glance-log"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.307240 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0de7d58-05cc-44bb-93f7-d74404c4ea56" containerName="glance-httpd"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.308110 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.311225 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.311597 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.322628 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c414ef60-f94a-4047-ad26-b7ca6fa3f93b-config\") pod \"c414ef60-f94a-4047-ad26-b7ca6fa3f93b\" (UID: \"c414ef60-f94a-4047-ad26-b7ca6fa3f93b\") "
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.322700 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-config-data\") pod \"8127655f-342a-4bfc-a5c3-0a44bfc3cb77\" (UID: \"8127655f-342a-4bfc-a5c3-0a44bfc3cb77\") "
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.322776 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c414ef60-f94a-4047-ad26-b7ca6fa3f93b-combined-ca-bundle\") pod \"c414ef60-f94a-4047-ad26-b7ca6fa3f93b\" (UID: \"c414ef60-f94a-4047-ad26-b7ca6fa3f93b\") "
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.322820 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pv7zm\" (UniqueName: \"kubernetes.io/projected/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-kube-api-access-pv7zm\") pod \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") "
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.322883 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-logs\") pod \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") "
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.322928 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-scripts\") pod \"8127655f-342a-4bfc-a5c3-0a44bfc3cb77\" (UID: \"8127655f-342a-4bfc-a5c3-0a44bfc3cb77\") "
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.322959 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-logs\") pod \"8127655f-342a-4bfc-a5c3-0a44bfc3cb77\" (UID: \"8127655f-342a-4bfc-a5c3-0a44bfc3cb77\") "
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.322983 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") "
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.323032 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-config-data\") pod \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") "
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.323171 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-public-tls-certs\") pod \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") "
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.323262 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlkw8\" (UniqueName: \"kubernetes.io/projected/c414ef60-f94a-4047-ad26-b7ca6fa3f93b-kube-api-access-nlkw8\") pod \"c414ef60-f94a-4047-ad26-b7ca6fa3f93b\" (UID: \"c414ef60-f94a-4047-ad26-b7ca6fa3f93b\") "
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.323303 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-scripts\") pod \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") "
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.323327 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-combined-ca-bundle\") pod \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") "
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.323636 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-httpd-run\") pod \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\" (UID: \"49b497a5-fbf7-42a0-87c5-0edd1b788ee1\") "
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.323641 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-config-data" (OuterVolumeSpecName: "config-data") pod "8127655f-342a-4bfc-a5c3-0a44bfc3cb77" (UID: "8127655f-342a-4bfc-a5c3-0a44bfc3cb77"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.323701 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-horizon-secret-key\") pod \"8127655f-342a-4bfc-a5c3-0a44bfc3cb77\" (UID: \"8127655f-342a-4bfc-a5c3-0a44bfc3cb77\") "
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.323986 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdhsb\" (UniqueName: \"kubernetes.io/projected/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-kube-api-access-sdhsb\") pod \"8127655f-342a-4bfc-a5c3-0a44bfc3cb77\" (UID: \"8127655f-342a-4bfc-a5c3-0a44bfc3cb77\") "
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.324548 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.327282 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "49b497a5-fbf7-42a0-87c5-0edd1b788ee1" (UID: "49b497a5-fbf7-42a0-87c5-0edd1b788ee1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.327688 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-kube-api-access-sdhsb" (OuterVolumeSpecName: "kube-api-access-sdhsb") pod "8127655f-342a-4bfc-a5c3-0a44bfc3cb77" (UID: "8127655f-342a-4bfc-a5c3-0a44bfc3cb77"). InnerVolumeSpecName "kube-api-access-sdhsb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.328293 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-logs" (OuterVolumeSpecName: "logs") pod "49b497a5-fbf7-42a0-87c5-0edd1b788ee1" (UID: "49b497a5-fbf7-42a0-87c5-0edd1b788ee1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.328806 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-scripts" (OuterVolumeSpecName: "scripts") pod "8127655f-342a-4bfc-a5c3-0a44bfc3cb77" (UID: "8127655f-342a-4bfc-a5c3-0a44bfc3cb77"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.330042 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-logs" (OuterVolumeSpecName: "logs") pod "8127655f-342a-4bfc-a5c3-0a44bfc3cb77" (UID: "8127655f-342a-4bfc-a5c3-0a44bfc3cb77"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.330120 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8127655f-342a-4bfc-a5c3-0a44bfc3cb77" (UID: "8127655f-342a-4bfc-a5c3-0a44bfc3cb77"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.330138 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "49b497a5-fbf7-42a0-87c5-0edd1b788ee1" (UID: "49b497a5-fbf7-42a0-87c5-0edd1b788ee1"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.331392 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.333785 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-kube-api-access-pv7zm" (OuterVolumeSpecName: "kube-api-access-pv7zm") pod "49b497a5-fbf7-42a0-87c5-0edd1b788ee1" (UID: "49b497a5-fbf7-42a0-87c5-0edd1b788ee1"). InnerVolumeSpecName "kube-api-access-pv7zm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.335436 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c414ef60-f94a-4047-ad26-b7ca6fa3f93b-kube-api-access-nlkw8" (OuterVolumeSpecName: "kube-api-access-nlkw8") pod "c414ef60-f94a-4047-ad26-b7ca6fa3f93b" (UID: "c414ef60-f94a-4047-ad26-b7ca6fa3f93b"). InnerVolumeSpecName "kube-api-access-nlkw8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.350208 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-scripts" (OuterVolumeSpecName: "scripts") pod "49b497a5-fbf7-42a0-87c5-0edd1b788ee1" (UID: "49b497a5-fbf7-42a0-87c5-0edd1b788ee1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.365590 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c414ef60-f94a-4047-ad26-b7ca6fa3f93b-config" (OuterVolumeSpecName: "config") pod "c414ef60-f94a-4047-ad26-b7ca6fa3f93b" (UID: "c414ef60-f94a-4047-ad26-b7ca6fa3f93b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.373562 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c414ef60-f94a-4047-ad26-b7ca6fa3f93b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c414ef60-f94a-4047-ad26-b7ca6fa3f93b" (UID: "c414ef60-f94a-4047-ad26-b7ca6fa3f93b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.382046 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49b497a5-fbf7-42a0-87c5-0edd1b788ee1" (UID: "49b497a5-fbf7-42a0-87c5-0edd1b788ee1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.382994 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-config-data" (OuterVolumeSpecName: "config-data") pod "49b497a5-fbf7-42a0-87c5-0edd1b788ee1" (UID: "49b497a5-fbf7-42a0-87c5-0edd1b788ee1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.394281 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "49b497a5-fbf7-42a0-87c5-0edd1b788ee1" (UID: "49b497a5-fbf7-42a0-87c5-0edd1b788ee1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.425671 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4rms\" (UniqueName: \"kubernetes.io/projected/88723829-b0c8-4bc7-92fc-63f9767ff69c-kube-api-access-n4rms\") pod \"glance-default-internal-api-0\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.425781 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.425828 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88723829-b0c8-4bc7-92fc-63f9767ff69c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.425881 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/88723829-b0c8-4bc7-92fc-63f9767ff69c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.425919 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88723829-b0c8-4bc7-92fc-63f9767ff69c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.425942 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88723829-b0c8-4bc7-92fc-63f9767ff69c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.425983 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88723829-b0c8-4bc7-92fc-63f9767ff69c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.426025 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88723829-b0c8-4bc7-92fc-63f9767ff69c-logs\") pod \"glance-default-internal-api-0\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.426248 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-logs\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.426280 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.426293 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-logs\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.426321 4869 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.426334 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.426349 4869 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.426363 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlkw8\" (UniqueName: \"kubernetes.io/projected/c414ef60-f94a-4047-ad26-b7ca6fa3f93b-kube-api-access-nlkw8\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.426377 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.426390 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.426402 4869 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.426411 4869 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.426422 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdhsb\" (UniqueName: \"kubernetes.io/projected/8127655f-342a-4bfc-a5c3-0a44bfc3cb77-kube-api-access-sdhsb\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.426431 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c414ef60-f94a-4047-ad26-b7ca6fa3f93b-config\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.426440 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c414ef60-f94a-4047-ad26-b7ca6fa3f93b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.426448 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pv7zm\" (UniqueName: \"kubernetes.io/projected/49b497a5-fbf7-42a0-87c5-0edd1b788ee1-kube-api-access-pv7zm\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.445004 4869 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.491852 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73514d80-ca11-468b-a6a0-4c3fffff8fea" path="/var/lib/kubelet/pods/73514d80-ca11-468b-a6a0-4c3fffff8fea/volumes"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.492499 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0de7d58-05cc-44bb-93f7-d74404c4ea56" path="/var/lib/kubelet/pods/d0de7d58-05cc-44bb-93f7-d74404c4ea56/volumes"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.528000 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4rms\" (UniqueName: \"kubernetes.io/projected/88723829-b0c8-4bc7-92fc-63f9767ff69c-kube-api-access-n4rms\") pod \"glance-default-internal-api-0\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.528077 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.528120 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88723829-b0c8-4bc7-92fc-63f9767ff69c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.528155 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/88723829-b0c8-4bc7-92fc-63f9767ff69c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.528184 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88723829-b0c8-4bc7-92fc-63f9767ff69c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.528208 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88723829-b0c8-4bc7-92fc-63f9767ff69c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.528239 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88723829-b0c8-4bc7-92fc-63f9767ff69c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.528272 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88723829-b0c8-4bc7-92fc-63f9767ff69c-logs\") pod \"glance-default-internal-api-0\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.528327 4869 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.528371 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.528700 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88723829-b0c8-4bc7-92fc-63f9767ff69c-logs\") pod \"glance-default-internal-api-0\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.530113 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/88723829-b0c8-4bc7-92fc-63f9767ff69c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.532098 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88723829-b0c8-4bc7-92fc-63f9767ff69c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.532627 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88723829-b0c8-4bc7-92fc-63f9767ff69c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.532661 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88723829-b0c8-4bc7-92fc-63f9767ff69c-scripts\") pod
\"glance-default-internal-api-0\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.533407 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88723829-b0c8-4bc7-92fc-63f9767ff69c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.546178 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4rms\" (UniqueName: \"kubernetes.io/projected/88723829-b0c8-4bc7-92fc-63f9767ff69c-kube-api-access-n4rms\") pod \"glance-default-internal-api-0\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.563139 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.636399 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.881362 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7cn7l" event={"ID":"c414ef60-f94a-4047-ad26-b7ca6fa3f93b","Type":"ContainerDied","Data":"93cdc1a2c705631058497960e54db5e21654d74d925b2dae5a82104c147c87d5"} Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.881449 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93cdc1a2c705631058497960e54db5e21654d74d925b2dae5a82104c147c87d5" Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.881391 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7cn7l" Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.885037 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"49b497a5-fbf7-42a0-87c5-0edd1b788ee1","Type":"ContainerDied","Data":"5d5f40ef4d50fa74f787b6d6b40be9947bc5f65b824c7481b2689ffc369e9b74"} Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.885067 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.887797 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-745b5b966f-7mdzf" Feb 18 06:05:53 crc kubenswrapper[4869]: I0218 06:05:53.888637 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-745b5b966f-7mdzf" event={"ID":"8127655f-342a-4bfc-a5c3-0a44bfc3cb77","Type":"ContainerDied","Data":"1667417614db1ddb3113785522c046ad2859005f5c04d8694ff6cd4bcf191787"} Feb 18 06:05:53 crc kubenswrapper[4869]: E0218 06:05:53.889828 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-zzcxf" podUID="ffdffd9a-f626-4bf2-b1e0-104eca55e7f5" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.003618 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-745b5b966f-7mdzf"] Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.010354 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-745b5b966f-7mdzf"] Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.020129 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.029271 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.036712 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.038555 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.040428 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.040642 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.043010 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.146569 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/696bf351-11ee-47b1-bda4-8968aa32af8f-scripts\") pod \"glance-default-external-api-0\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.146660 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.146683 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7jc7\" (UniqueName: \"kubernetes.io/projected/696bf351-11ee-47b1-bda4-8968aa32af8f-kube-api-access-k7jc7\") pod \"glance-default-external-api-0\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.146708 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/696bf351-11ee-47b1-bda4-8968aa32af8f-config-data\") pod \"glance-default-external-api-0\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.146773 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696bf351-11ee-47b1-bda4-8968aa32af8f-logs\") pod \"glance-default-external-api-0\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.146820 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/696bf351-11ee-47b1-bda4-8968aa32af8f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.146847 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696bf351-11ee-47b1-bda4-8968aa32af8f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.146874 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/696bf351-11ee-47b1-bda4-8968aa32af8f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.248783 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.248831 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7jc7\" (UniqueName: \"kubernetes.io/projected/696bf351-11ee-47b1-bda4-8968aa32af8f-kube-api-access-k7jc7\") pod \"glance-default-external-api-0\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.248862 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696bf351-11ee-47b1-bda4-8968aa32af8f-config-data\") pod \"glance-default-external-api-0\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.248906 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696bf351-11ee-47b1-bda4-8968aa32af8f-logs\") pod \"glance-default-external-api-0\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.248979 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.249387 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/696bf351-11ee-47b1-bda4-8968aa32af8f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.249589 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/696bf351-11ee-47b1-bda4-8968aa32af8f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.249840 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696bf351-11ee-47b1-bda4-8968aa32af8f-logs\") pod \"glance-default-external-api-0\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.250008 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696bf351-11ee-47b1-bda4-8968aa32af8f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.250085 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/696bf351-11ee-47b1-bda4-8968aa32af8f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.250225 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/696bf351-11ee-47b1-bda4-8968aa32af8f-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.254099 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696bf351-11ee-47b1-bda4-8968aa32af8f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.256861 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696bf351-11ee-47b1-bda4-8968aa32af8f-config-data\") pod \"glance-default-external-api-0\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.257501 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/696bf351-11ee-47b1-bda4-8968aa32af8f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.258790 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/696bf351-11ee-47b1-bda4-8968aa32af8f-scripts\") pod \"glance-default-external-api-0\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.266676 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7jc7\" (UniqueName: \"kubernetes.io/projected/696bf351-11ee-47b1-bda4-8968aa32af8f-kube-api-access-k7jc7\") pod \"glance-default-external-api-0\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " 
pod="openstack/glance-default-external-api-0" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.280362 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " pod="openstack/glance-default-external-api-0" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.361677 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.459709 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-7vv8w"] Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.462085 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.465023 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-7vv8w"] Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.954594 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-7vv8w\" (UID: \"1467815e-0912-4dc2-b87d-4cab891b93b2\") " pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.954696 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-7vv8w\" (UID: \"1467815e-0912-4dc2-b87d-4cab891b93b2\") " pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.954766 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-7vv8w\" (UID: \"1467815e-0912-4dc2-b87d-4cab891b93b2\") " pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.954813 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-config\") pod \"dnsmasq-dns-55f844cf75-7vv8w\" (UID: \"1467815e-0912-4dc2-b87d-4cab891b93b2\") " pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.954861 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgxpp\" (UniqueName: \"kubernetes.io/projected/1467815e-0912-4dc2-b87d-4cab891b93b2-kube-api-access-sgxpp\") pod \"dnsmasq-dns-55f844cf75-7vv8w\" (UID: \"1467815e-0912-4dc2-b87d-4cab891b93b2\") " pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" Feb 18 06:05:54 crc kubenswrapper[4869]: I0218 06:05:54.954924 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-dns-svc\") pod \"dnsmasq-dns-55f844cf75-7vv8w\" (UID: \"1467815e-0912-4dc2-b87d-4cab891b93b2\") " pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.056192 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5b5d66448d-sbn85"] Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.056260 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-dns-swift-storage-0\") pod 
\"dnsmasq-dns-55f844cf75-7vv8w\" (UID: \"1467815e-0912-4dc2-b87d-4cab891b93b2\") " pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.056503 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-config\") pod \"dnsmasq-dns-55f844cf75-7vv8w\" (UID: \"1467815e-0912-4dc2-b87d-4cab891b93b2\") " pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.056534 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgxpp\" (UniqueName: \"kubernetes.io/projected/1467815e-0912-4dc2-b87d-4cab891b93b2-kube-api-access-sgxpp\") pod \"dnsmasq-dns-55f844cf75-7vv8w\" (UID: \"1467815e-0912-4dc2-b87d-4cab891b93b2\") " pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.056575 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-dns-svc\") pod \"dnsmasq-dns-55f844cf75-7vv8w\" (UID: \"1467815e-0912-4dc2-b87d-4cab891b93b2\") " pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.056647 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-7vv8w\" (UID: \"1467815e-0912-4dc2-b87d-4cab891b93b2\") " pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.056687 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-7vv8w\" (UID: 
\"1467815e-0912-4dc2-b87d-4cab891b93b2\") " pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.057476 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-7vv8w\" (UID: \"1467815e-0912-4dc2-b87d-4cab891b93b2\") " pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.059668 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-dns-svc\") pod \"dnsmasq-dns-55f844cf75-7vv8w\" (UID: \"1467815e-0912-4dc2-b87d-4cab891b93b2\") " pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.060233 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-7vv8w\" (UID: \"1467815e-0912-4dc2-b87d-4cab891b93b2\") " pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.061382 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5b5d66448d-sbn85" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.061667 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-config\") pod \"dnsmasq-dns-55f844cf75-7vv8w\" (UID: \"1467815e-0912-4dc2-b87d-4cab891b93b2\") " pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.061818 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-7vv8w\" (UID: \"1467815e-0912-4dc2-b87d-4cab891b93b2\") " pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.065491 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.065820 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.065964 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wkh8t" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.066133 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.079297 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgxpp\" (UniqueName: \"kubernetes.io/projected/1467815e-0912-4dc2-b87d-4cab891b93b2-kube-api-access-sgxpp\") pod \"dnsmasq-dns-55f844cf75-7vv8w\" (UID: \"1467815e-0912-4dc2-b87d-4cab891b93b2\") " pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.080324 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-5b5d66448d-sbn85"] Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.082639 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.260257 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-config\") pod \"neutron-5b5d66448d-sbn85\" (UID: \"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f\") " pod="openstack/neutron-5b5d66448d-sbn85" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.260314 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-combined-ca-bundle\") pod \"neutron-5b5d66448d-sbn85\" (UID: \"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f\") " pod="openstack/neutron-5b5d66448d-sbn85" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.260411 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-httpd-config\") pod \"neutron-5b5d66448d-sbn85\" (UID: \"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f\") " pod="openstack/neutron-5b5d66448d-sbn85" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.260649 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-ovndb-tls-certs\") pod \"neutron-5b5d66448d-sbn85\" (UID: \"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f\") " pod="openstack/neutron-5b5d66448d-sbn85" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.260772 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj7m6\" 
(UniqueName: \"kubernetes.io/projected/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-kube-api-access-gj7m6\") pod \"neutron-5b5d66448d-sbn85\" (UID: \"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f\") " pod="openstack/neutron-5b5d66448d-sbn85" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.363026 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-ovndb-tls-certs\") pod \"neutron-5b5d66448d-sbn85\" (UID: \"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f\") " pod="openstack/neutron-5b5d66448d-sbn85" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.363175 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj7m6\" (UniqueName: \"kubernetes.io/projected/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-kube-api-access-gj7m6\") pod \"neutron-5b5d66448d-sbn85\" (UID: \"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f\") " pod="openstack/neutron-5b5d66448d-sbn85" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.363397 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-config\") pod \"neutron-5b5d66448d-sbn85\" (UID: \"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f\") " pod="openstack/neutron-5b5d66448d-sbn85" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.363465 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-combined-ca-bundle\") pod \"neutron-5b5d66448d-sbn85\" (UID: \"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f\") " pod="openstack/neutron-5b5d66448d-sbn85" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.363526 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-httpd-config\") 
pod \"neutron-5b5d66448d-sbn85\" (UID: \"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f\") " pod="openstack/neutron-5b5d66448d-sbn85" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.367841 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-httpd-config\") pod \"neutron-5b5d66448d-sbn85\" (UID: \"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f\") " pod="openstack/neutron-5b5d66448d-sbn85" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.368589 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-combined-ca-bundle\") pod \"neutron-5b5d66448d-sbn85\" (UID: \"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f\") " pod="openstack/neutron-5b5d66448d-sbn85" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.375470 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-ovndb-tls-certs\") pod \"neutron-5b5d66448d-sbn85\" (UID: \"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f\") " pod="openstack/neutron-5b5d66448d-sbn85" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.375550 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-config\") pod \"neutron-5b5d66448d-sbn85\" (UID: \"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f\") " pod="openstack/neutron-5b5d66448d-sbn85" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.381390 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj7m6\" (UniqueName: \"kubernetes.io/projected/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-kube-api-access-gj7m6\") pod \"neutron-5b5d66448d-sbn85\" (UID: \"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f\") " pod="openstack/neutron-5b5d66448d-sbn85" Feb 18 06:05:55 
crc kubenswrapper[4869]: E0218 06:05:55.393962 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Feb 18 06:05:55 crc kubenswrapper[4869]: E0218 06:05:55.394203 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMo
unt{Name:kube-api-access-hxn66,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-8v5fn_openstack(77d2d3cf-1108-468b-816a-64d29471542e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:05:55 crc kubenswrapper[4869]: E0218 06:05:55.396112 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-8v5fn" podUID="77d2d3cf-1108-468b-816a-64d29471542e" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.404196 4869 scope.go:117] "RemoveContainer" containerID="ce12f91d6c365f13573c7ef014d966a63cf08797d1bd54a54585de51bdc7653c" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.464761 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5b5d66448d-sbn85" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.535525 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49b497a5-fbf7-42a0-87c5-0edd1b788ee1" path="/var/lib/kubelet/pods/49b497a5-fbf7-42a0-87c5-0edd1b788ee1/volumes" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.536548 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8127655f-342a-4bfc-a5c3-0a44bfc3cb77" path="/var/lib/kubelet/pods/8127655f-342a-4bfc-a5c3-0a44bfc3cb77/volumes" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.592699 4869 scope.go:117] "RemoveContainer" containerID="295374791c43887f553c0032b6a4f6bb2a9d05879bb74f5633032af93b66511f" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.653587 4869 scope.go:117] "RemoveContainer" containerID="7816ce0a15d7c39b2a7aab31b12c49bf3c956380c139ce57b0f6abe2c0fffc74" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.696417 4869 scope.go:117] "RemoveContainer" containerID="e9aa428c4be69c0e0b15838600b79a6b13e4ab9a1ec45b395bbbfc917ff1d817" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.762200 4869 scope.go:117] "RemoveContainer" containerID="5307b8bc58eae29370fac3f174b160e2bcf3d4deed155a2c7de5b81bb8e3dd8f" Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.874077 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-69d999cf4d-drf2r"] Feb 18 06:05:55 crc kubenswrapper[4869]: I0218 06:05:55.991718 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-57f5fddd88-qhh5n"] Feb 18 06:05:56 crc kubenswrapper[4869]: I0218 06:05:56.012885 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76444b47f5-mc7kc" event={"ID":"e4fa15d8-bacc-4ce0-bd25-41e451404ab3","Type":"ContainerStarted","Data":"6b53a17c713e4c990e9ff86e1c691ccfe97fd8d5df4122747f7c2708ef7013b0"} Feb 18 06:05:56 crc kubenswrapper[4869]: I0218 06:05:56.031971 4869 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/horizon-588975f6bf-9447r" event={"ID":"0b9571fd-7aa2-4e30-81f9-465a9c4291c8","Type":"ContainerStarted","Data":"133059aaf4bdfa3e2a9413f5b02bf21f805d0f540f64d9c0062fd942f5366210"} Feb 18 06:05:56 crc kubenswrapper[4869]: I0218 06:05:56.034403 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91512d0d-84f7-41c0-aca8-cbf9d2839927","Type":"ContainerStarted","Data":"f09f7ddec431acb4437d2f6a245a598f31fbdd07457ae52f0d2bddb63ae30624"} Feb 18 06:05:56 crc kubenswrapper[4869]: I0218 06:05:56.050349 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gk2zt" event={"ID":"50b8c093-2eb4-4220-b335-b5b94fb8776e","Type":"ContainerStarted","Data":"e7fe756e70be2b967d5c71dcb58200ea0cc65a7f24d645a4c839f706c050a2df"} Feb 18 06:05:56 crc kubenswrapper[4869]: I0218 06:05:56.058154 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69d999cf4d-drf2r" event={"ID":"adfe77ee-719d-4b80-ae06-8a0a370cf7d2","Type":"ContainerStarted","Data":"cc1e9d5ba551f30508b9c6b39a2577638a716d09b551509471801f38783ba271"} Feb 18 06:05:56 crc kubenswrapper[4869]: I0218 06:05:56.084900 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-gk2zt" podStartSLOduration=3.551470041 podStartE2EDuration="31.084876392s" podCreationTimestamp="2026-02-18 06:05:25 +0000 UTC" firstStartedPulling="2026-02-18 06:05:28.049598986 +0000 UTC m=+1025.218687218" lastFinishedPulling="2026-02-18 06:05:55.583005337 +0000 UTC m=+1052.752093569" observedRunningTime="2026-02-18 06:05:56.072708126 +0000 UTC m=+1053.241796358" watchObservedRunningTime="2026-02-18 06:05:56.084876392 +0000 UTC m=+1053.253964624" Feb 18 06:05:56 crc kubenswrapper[4869]: E0218 06:05:56.085866 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-8v5fn" podUID="77d2d3cf-1108-468b-816a-64d29471542e" Feb 18 06:05:56 crc kubenswrapper[4869]: I0218 06:05:56.127166 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-7vv8w"] Feb 18 06:05:56 crc kubenswrapper[4869]: I0218 06:05:56.155918 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8wq6q"] Feb 18 06:05:56 crc kubenswrapper[4869]: W0218 06:05:56.191508 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1467815e_0912_4dc2_b87d_4cab891b93b2.slice/crio-27eb9e1c9d3f083d7817ef7b0b61d41a107254768860c4fdd14c12d72e2d2420 WatchSource:0}: Error finding container 27eb9e1c9d3f083d7817ef7b0b61d41a107254768860c4fdd14c12d72e2d2420: Status 404 returned error can't find the container with id 27eb9e1c9d3f083d7817ef7b0b61d41a107254768860c4fdd14c12d72e2d2420 Feb 18 06:05:56 crc kubenswrapper[4869]: W0218 06:05:56.208528 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc953d279_bde4_4a98_87d5_3cbcaefc875c.slice/crio-6781803eb6b46d0a6fec1f481af5e3c4082d21940be0f54a0fcf431cd054a92b WatchSource:0}: Error finding container 6781803eb6b46d0a6fec1f481af5e3c4082d21940be0f54a0fcf431cd054a92b: Status 404 returned error can't find the container with id 6781803eb6b46d0a6fec1f481af5e3c4082d21940be0f54a0fcf431cd054a92b Feb 18 06:05:56 crc kubenswrapper[4869]: I0218 06:05:56.405837 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 06:05:56 crc kubenswrapper[4869]: W0218 06:05:56.421724 4869 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88723829_b0c8_4bc7_92fc_63f9767ff69c.slice/crio-226eef74f20f961341d7f242d8464557c1ba70f0448aee436e751beaba7c35a5 WatchSource:0}: Error finding container 226eef74f20f961341d7f242d8464557c1ba70f0448aee436e751beaba7c35a5: Status 404 returned error can't find the container with id 226eef74f20f961341d7f242d8464557c1ba70f0448aee436e751beaba7c35a5 Feb 18 06:05:56 crc kubenswrapper[4869]: I0218 06:05:56.809384 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-g6q8b" podUID="73514d80-ca11-468b-a6a0-4c3fffff8fea" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: i/o timeout" Feb 18 06:05:57 crc kubenswrapper[4869]: I0218 06:05:57.081572 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"88723829-b0c8-4bc7-92fc-63f9767ff69c","Type":"ContainerStarted","Data":"df96014092d7b3e13bd5aa91aa9627b3d1575583149af3492195154d5ffa5ed0"} Feb 18 06:05:57 crc kubenswrapper[4869]: I0218 06:05:57.081627 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"88723829-b0c8-4bc7-92fc-63f9767ff69c","Type":"ContainerStarted","Data":"226eef74f20f961341d7f242d8464557c1ba70f0448aee436e751beaba7c35a5"} Feb 18 06:05:57 crc kubenswrapper[4869]: I0218 06:05:57.100219 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-588975f6bf-9447r" event={"ID":"0b9571fd-7aa2-4e30-81f9-465a9c4291c8","Type":"ContainerStarted","Data":"0c33ee7dce2e3e8ae9c91cf94f0415ddb5d6ceecec750a69841709572791bcb1"} Feb 18 06:05:57 crc kubenswrapper[4869]: I0218 06:05:57.100334 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-588975f6bf-9447r" podUID="0b9571fd-7aa2-4e30-81f9-465a9c4291c8" containerName="horizon-log" 
containerID="cri-o://133059aaf4bdfa3e2a9413f5b02bf21f805d0f540f64d9c0062fd942f5366210" gracePeriod=30 Feb 18 06:05:57 crc kubenswrapper[4869]: I0218 06:05:57.100427 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-588975f6bf-9447r" podUID="0b9571fd-7aa2-4e30-81f9-465a9c4291c8" containerName="horizon" containerID="cri-o://0c33ee7dce2e3e8ae9c91cf94f0415ddb5d6ceecec750a69841709572791bcb1" gracePeriod=30 Feb 18 06:05:57 crc kubenswrapper[4869]: I0218 06:05:57.105304 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69d999cf4d-drf2r" event={"ID":"adfe77ee-719d-4b80-ae06-8a0a370cf7d2","Type":"ContainerStarted","Data":"6db09d6ae8339e476ae31de77876ac066531920f74a0a3430c02d01110a5150c"} Feb 18 06:05:57 crc kubenswrapper[4869]: I0218 06:05:57.105352 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69d999cf4d-drf2r" event={"ID":"adfe77ee-719d-4b80-ae06-8a0a370cf7d2","Type":"ContainerStarted","Data":"fbe2d81ec4b4d1900d8d3f98fb8f0834cb3dfa672b3fcf5bb915d1b5dc3fc07b"} Feb 18 06:05:57 crc kubenswrapper[4869]: I0218 06:05:57.115123 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8wq6q" event={"ID":"c953d279-bde4-4a98-87d5-3cbcaefc875c","Type":"ContainerStarted","Data":"accb33c9fcc62c78ffa3fb316288bd7d9ed68be05f9e7fc595f6cbbea44f8ced"} Feb 18 06:05:57 crc kubenswrapper[4869]: I0218 06:05:57.115156 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8wq6q" event={"ID":"c953d279-bde4-4a98-87d5-3cbcaefc875c","Type":"ContainerStarted","Data":"6781803eb6b46d0a6fec1f481af5e3c4082d21940be0f54a0fcf431cd054a92b"} Feb 18 06:05:57 crc kubenswrapper[4869]: I0218 06:05:57.137438 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-588975f6bf-9447r" podStartSLOduration=2.760051199 podStartE2EDuration="28.137413545s" podCreationTimestamp="2026-02-18 06:05:29 +0000 UTC" 
firstStartedPulling="2026-02-18 06:05:30.041595255 +0000 UTC m=+1027.210683487" lastFinishedPulling="2026-02-18 06:05:55.418957601 +0000 UTC m=+1052.588045833" observedRunningTime="2026-02-18 06:05:57.132924106 +0000 UTC m=+1054.302012348" watchObservedRunningTime="2026-02-18 06:05:57.137413545 +0000 UTC m=+1054.306501777" Feb 18 06:05:57 crc kubenswrapper[4869]: I0218 06:05:57.138369 4869 generic.go:334] "Generic (PLEG): container finished" podID="1467815e-0912-4dc2-b87d-4cab891b93b2" containerID="1a6299fde2a547fc6e0eff4cfb840ad831bec2b12166e35da121687af42e739d" exitCode=0 Feb 18 06:05:57 crc kubenswrapper[4869]: I0218 06:05:57.138474 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" event={"ID":"1467815e-0912-4dc2-b87d-4cab891b93b2","Type":"ContainerDied","Data":"1a6299fde2a547fc6e0eff4cfb840ad831bec2b12166e35da121687af42e739d"} Feb 18 06:05:57 crc kubenswrapper[4869]: I0218 06:05:57.138505 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" event={"ID":"1467815e-0912-4dc2-b87d-4cab891b93b2","Type":"ContainerStarted","Data":"27eb9e1c9d3f083d7817ef7b0b61d41a107254768860c4fdd14c12d72e2d2420"} Feb 18 06:05:57 crc kubenswrapper[4869]: I0218 06:05:57.156087 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-69d999cf4d-drf2r" podStartSLOduration=23.156070449 podStartE2EDuration="23.156070449s" podCreationTimestamp="2026-02-18 06:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:57.15529783 +0000 UTC m=+1054.324386072" watchObservedRunningTime="2026-02-18 06:05:57.156070449 +0000 UTC m=+1054.325158681" Feb 18 06:05:57 crc kubenswrapper[4869]: I0218 06:05:57.167286 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76444b47f5-mc7kc" 
event={"ID":"e4fa15d8-bacc-4ce0-bd25-41e451404ab3","Type":"ContainerStarted","Data":"aaf2b7c22977689767c686369358b3000da99eb4f168cee6c4f01c96f3fd4313"} Feb 18 06:05:57 crc kubenswrapper[4869]: I0218 06:05:57.167311 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-76444b47f5-mc7kc" podUID="e4fa15d8-bacc-4ce0-bd25-41e451404ab3" containerName="horizon-log" containerID="cri-o://6b53a17c713e4c990e9ff86e1c691ccfe97fd8d5df4122747f7c2708ef7013b0" gracePeriod=30 Feb 18 06:05:57 crc kubenswrapper[4869]: I0218 06:05:57.167421 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-76444b47f5-mc7kc" podUID="e4fa15d8-bacc-4ce0-bd25-41e451404ab3" containerName="horizon" containerID="cri-o://aaf2b7c22977689767c686369358b3000da99eb4f168cee6c4f01c96f3fd4313" gracePeriod=30 Feb 18 06:05:57 crc kubenswrapper[4869]: I0218 06:05:57.179020 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57f5fddd88-qhh5n" event={"ID":"391d8fe4-58ea-434e-918f-811b7c3e14b2","Type":"ContainerStarted","Data":"8b0ebff3335624c2ebd29583474dfd30fbd6182817ab401660fdecc5e05b280b"} Feb 18 06:05:57 crc kubenswrapper[4869]: I0218 06:05:57.179061 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57f5fddd88-qhh5n" event={"ID":"391d8fe4-58ea-434e-918f-811b7c3e14b2","Type":"ContainerStarted","Data":"7696d250a140db027363e86466070c8f9eaad0653c96c69d0b995c6b5097f0ee"} Feb 18 06:05:57 crc kubenswrapper[4869]: I0218 06:05:57.179073 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57f5fddd88-qhh5n" event={"ID":"391d8fe4-58ea-434e-918f-811b7c3e14b2","Type":"ContainerStarted","Data":"12e6b5c26a214fafdc3c68ad3ef225453e9c757e266d41cc30364b24b597cefb"} Feb 18 06:05:57 crc kubenswrapper[4869]: I0218 06:05:57.189410 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-8wq6q" 
podStartSLOduration=15.189381828 podStartE2EDuration="15.189381828s" podCreationTimestamp="2026-02-18 06:05:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:57.174010355 +0000 UTC m=+1054.343098597" watchObservedRunningTime="2026-02-18 06:05:57.189381828 +0000 UTC m=+1054.358470060" Feb 18 06:05:57 crc kubenswrapper[4869]: I0218 06:05:57.215823 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-76444b47f5-mc7kc" podStartSLOduration=7.516392687 podStartE2EDuration="32.21580805s" podCreationTimestamp="2026-02-18 06:05:25 +0000 UTC" firstStartedPulling="2026-02-18 06:05:28.466550266 +0000 UTC m=+1025.635638498" lastFinishedPulling="2026-02-18 06:05:53.165965629 +0000 UTC m=+1050.335053861" observedRunningTime="2026-02-18 06:05:57.195049246 +0000 UTC m=+1054.364137478" watchObservedRunningTime="2026-02-18 06:05:57.21580805 +0000 UTC m=+1054.384896282" Feb 18 06:05:57 crc kubenswrapper[4869]: I0218 06:05:57.248426 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-57f5fddd88-qhh5n" podStartSLOduration=23.248406962 podStartE2EDuration="23.248406962s" podCreationTimestamp="2026-02-18 06:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:57.24131438 +0000 UTC m=+1054.410402612" watchObservedRunningTime="2026-02-18 06:05:57.248406962 +0000 UTC m=+1054.417495194" Feb 18 06:05:57 crc kubenswrapper[4869]: I0218 06:05:57.438622 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b5d66448d-sbn85"] Feb 18 06:05:57 crc kubenswrapper[4869]: I0218 06:05:57.536347 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:05:58 crc kubenswrapper[4869]: I0218 06:05:58.224018 4869 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" event={"ID":"1467815e-0912-4dc2-b87d-4cab891b93b2","Type":"ContainerStarted","Data":"36fdbafff93225d72b82cafe24fdc9a0997a1ed8ebf247394f3ce2dfb7a47b77"} Feb 18 06:05:58 crc kubenswrapper[4869]: I0218 06:05:58.224846 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" Feb 18 06:05:58 crc kubenswrapper[4869]: I0218 06:05:58.233719 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b5d66448d-sbn85" event={"ID":"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f","Type":"ContainerStarted","Data":"46dcdbc8f8037700dc00414a2271dd253517c4a88d6d81555bd4952023c86483"} Feb 18 06:05:58 crc kubenswrapper[4869]: I0218 06:05:58.233788 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b5d66448d-sbn85" event={"ID":"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f","Type":"ContainerStarted","Data":"c66984bbffcb6467dfd99cd0bf446e5694703e67cc795e3464dde67788fd6d63"} Feb 18 06:05:58 crc kubenswrapper[4869]: I0218 06:05:58.261064 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"696bf351-11ee-47b1-bda4-8968aa32af8f","Type":"ContainerStarted","Data":"2a91f1c5803221d9a5a58e7af6112f7172361e73d3504f3ad7f545af34a9a798"} Feb 18 06:05:58 crc kubenswrapper[4869]: I0218 06:05:58.269186 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" podStartSLOduration=4.269163994 podStartE2EDuration="4.269163994s" podCreationTimestamp="2026-02-18 06:05:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:58.266639703 +0000 UTC m=+1055.435727935" watchObservedRunningTime="2026-02-18 06:05:58.269163994 +0000 UTC m=+1055.438252226" Feb 18 06:05:58 crc kubenswrapper[4869]: I0218 06:05:58.979845 4869 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/neutron-5466c5cc6f-xd5xf"] Feb 18 06:05:58 crc kubenswrapper[4869]: I0218 06:05:58.983358 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5466c5cc6f-xd5xf" Feb 18 06:05:58 crc kubenswrapper[4869]: I0218 06:05:58.987268 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 18 06:05:58 crc kubenswrapper[4869]: I0218 06:05:58.987983 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 18 06:05:58 crc kubenswrapper[4869]: I0218 06:05:58.995898 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5466c5cc6f-xd5xf"] Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.158523 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-internal-tls-certs\") pod \"neutron-5466c5cc6f-xd5xf\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " pod="openstack/neutron-5466c5cc6f-xd5xf" Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.158584 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-combined-ca-bundle\") pod \"neutron-5466c5cc6f-xd5xf\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " pod="openstack/neutron-5466c5cc6f-xd5xf" Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.158604 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpcv7\" (UniqueName: \"kubernetes.io/projected/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-kube-api-access-kpcv7\") pod \"neutron-5466c5cc6f-xd5xf\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " pod="openstack/neutron-5466c5cc6f-xd5xf" Feb 18 06:05:59 crc 
kubenswrapper[4869]: I0218 06:05:59.159027 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-config\") pod \"neutron-5466c5cc6f-xd5xf\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " pod="openstack/neutron-5466c5cc6f-xd5xf" Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.159080 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-ovndb-tls-certs\") pod \"neutron-5466c5cc6f-xd5xf\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " pod="openstack/neutron-5466c5cc6f-xd5xf" Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.159161 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-httpd-config\") pod \"neutron-5466c5cc6f-xd5xf\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " pod="openstack/neutron-5466c5cc6f-xd5xf" Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.159261 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-public-tls-certs\") pod \"neutron-5466c5cc6f-xd5xf\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " pod="openstack/neutron-5466c5cc6f-xd5xf" Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.261706 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-config\") pod \"neutron-5466c5cc6f-xd5xf\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " pod="openstack/neutron-5466c5cc6f-xd5xf" Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.261838 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-ovndb-tls-certs\") pod \"neutron-5466c5cc6f-xd5xf\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " pod="openstack/neutron-5466c5cc6f-xd5xf" Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.261906 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-httpd-config\") pod \"neutron-5466c5cc6f-xd5xf\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " pod="openstack/neutron-5466c5cc6f-xd5xf" Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.261926 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-public-tls-certs\") pod \"neutron-5466c5cc6f-xd5xf\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " pod="openstack/neutron-5466c5cc6f-xd5xf" Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.261991 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-internal-tls-certs\") pod \"neutron-5466c5cc6f-xd5xf\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " pod="openstack/neutron-5466c5cc6f-xd5xf" Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.262026 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpcv7\" (UniqueName: \"kubernetes.io/projected/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-kube-api-access-kpcv7\") pod \"neutron-5466c5cc6f-xd5xf\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " pod="openstack/neutron-5466c5cc6f-xd5xf" Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.262066 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-combined-ca-bundle\") pod \"neutron-5466c5cc6f-xd5xf\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " pod="openstack/neutron-5466c5cc6f-xd5xf" Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.269593 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-internal-tls-certs\") pod \"neutron-5466c5cc6f-xd5xf\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " pod="openstack/neutron-5466c5cc6f-xd5xf" Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.270040 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-httpd-config\") pod \"neutron-5466c5cc6f-xd5xf\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " pod="openstack/neutron-5466c5cc6f-xd5xf" Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.270818 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-config\") pod \"neutron-5466c5cc6f-xd5xf\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " pod="openstack/neutron-5466c5cc6f-xd5xf" Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.272429 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-ovndb-tls-certs\") pod \"neutron-5466c5cc6f-xd5xf\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " pod="openstack/neutron-5466c5cc6f-xd5xf" Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.276125 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-combined-ca-bundle\") pod \"neutron-5466c5cc6f-xd5xf\" 
(UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " pod="openstack/neutron-5466c5cc6f-xd5xf" Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.278482 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-public-tls-certs\") pod \"neutron-5466c5cc6f-xd5xf\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " pod="openstack/neutron-5466c5cc6f-xd5xf" Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.281400 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpcv7\" (UniqueName: \"kubernetes.io/projected/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-kube-api-access-kpcv7\") pod \"neutron-5466c5cc6f-xd5xf\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " pod="openstack/neutron-5466c5cc6f-xd5xf" Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.312112 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5466c5cc6f-xd5xf" Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.347365 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"88723829-b0c8-4bc7-92fc-63f9767ff69c","Type":"ContainerStarted","Data":"93cb7533a72cef118a3406d816719d3208d1e87c51e78c2f5ba7c7fe41f408ff"} Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.359978 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b5d66448d-sbn85" event={"ID":"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f","Type":"ContainerStarted","Data":"b5c7bf1250d0cc530ae366d625afed89128fc7c811ec074c78f9e7e23e8306fd"} Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.360688 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5b5d66448d-sbn85" Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.369009 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.368991847 podStartE2EDuration="6.368991847s" podCreationTimestamp="2026-02-18 06:05:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:59.36660907 +0000 UTC m=+1056.535697302" watchObservedRunningTime="2026-02-18 06:05:59.368991847 +0000 UTC m=+1056.538080079" Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.372402 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"696bf351-11ee-47b1-bda4-8968aa32af8f","Type":"ContainerStarted","Data":"3b4990a9258d9c42d96c115fe51582a869ae9a39017962bb7dc0b390043f3429"} Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.374105 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91512d0d-84f7-41c0-aca8-cbf9d2839927","Type":"ContainerStarted","Data":"ddd99ad38a349df8ed1aa180885bb67905bce7cc10ac1040ed3d5c8c37bc13b5"} Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.412421 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5b5d66448d-sbn85" podStartSLOduration=4.412381771 podStartE2EDuration="4.412381771s" podCreationTimestamp="2026-02-18 06:05:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:59.391189546 +0000 UTC m=+1056.560277778" watchObservedRunningTime="2026-02-18 06:05:59.412381771 +0000 UTC m=+1056.581470003" Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.414021 4869 generic.go:334] "Generic (PLEG): container finished" podID="50b8c093-2eb4-4220-b335-b5b94fb8776e" containerID="e7fe756e70be2b967d5c71dcb58200ea0cc65a7f24d645a4c839f706c050a2df" exitCode=0 Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.415023 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-db-sync-gk2zt" event={"ID":"50b8c093-2eb4-4220-b335-b5b94fb8776e","Type":"ContainerDied","Data":"e7fe756e70be2b967d5c71dcb58200ea0cc65a7f24d645a4c839f706c050a2df"} Feb 18 06:05:59 crc kubenswrapper[4869]: I0218 06:05:59.422898 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-588975f6bf-9447r" Feb 18 06:06:00 crc kubenswrapper[4869]: I0218 06:06:00.029711 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5466c5cc6f-xd5xf"] Feb 18 06:06:00 crc kubenswrapper[4869]: I0218 06:06:00.435221 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5466c5cc6f-xd5xf" event={"ID":"2f13a3e9-97c1-4eaa-a0fb-f449a201a542","Type":"ContainerStarted","Data":"fe797e117876091705bdd0135f414262e2fd9f9a35401ac398945c6ef78e7287"} Feb 18 06:06:00 crc kubenswrapper[4869]: I0218 06:06:00.435596 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5466c5cc6f-xd5xf" event={"ID":"2f13a3e9-97c1-4eaa-a0fb-f449a201a542","Type":"ContainerStarted","Data":"54f75c82a3746f651d157cf929936ad5c90114789f51a5d0745301c521c51428"} Feb 18 06:06:00 crc kubenswrapper[4869]: I0218 06:06:00.443817 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"696bf351-11ee-47b1-bda4-8968aa32af8f","Type":"ContainerStarted","Data":"4e71bf1b8b96f550532c65e6aae3ca3b030455e41b574f8de2ada1855c213874"} Feb 18 06:06:00 crc kubenswrapper[4869]: I0218 06:06:00.472600 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.472577171 podStartE2EDuration="7.472577171s" podCreationTimestamp="2026-02-18 06:05:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:00.463164383 +0000 UTC m=+1057.632252635" watchObservedRunningTime="2026-02-18 
06:06:00.472577171 +0000 UTC m=+1057.641665403" Feb 18 06:06:00 crc kubenswrapper[4869]: I0218 06:06:00.866791 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-gk2zt" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.001844 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b8c093-2eb4-4220-b335-b5b94fb8776e-scripts\") pod \"50b8c093-2eb4-4220-b335-b5b94fb8776e\" (UID: \"50b8c093-2eb4-4220-b335-b5b94fb8776e\") " Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.002034 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50b8c093-2eb4-4220-b335-b5b94fb8776e-logs\") pod \"50b8c093-2eb4-4220-b335-b5b94fb8776e\" (UID: \"50b8c093-2eb4-4220-b335-b5b94fb8776e\") " Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.002109 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b8c093-2eb4-4220-b335-b5b94fb8776e-config-data\") pod \"50b8c093-2eb4-4220-b335-b5b94fb8776e\" (UID: \"50b8c093-2eb4-4220-b335-b5b94fb8776e\") " Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.002198 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swrdw\" (UniqueName: \"kubernetes.io/projected/50b8c093-2eb4-4220-b335-b5b94fb8776e-kube-api-access-swrdw\") pod \"50b8c093-2eb4-4220-b335-b5b94fb8776e\" (UID: \"50b8c093-2eb4-4220-b335-b5b94fb8776e\") " Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.002252 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b8c093-2eb4-4220-b335-b5b94fb8776e-combined-ca-bundle\") pod \"50b8c093-2eb4-4220-b335-b5b94fb8776e\" (UID: \"50b8c093-2eb4-4220-b335-b5b94fb8776e\") " Feb 18 06:06:01 crc 
kubenswrapper[4869]: I0218 06:06:01.004905 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50b8c093-2eb4-4220-b335-b5b94fb8776e-logs" (OuterVolumeSpecName: "logs") pod "50b8c093-2eb4-4220-b335-b5b94fb8776e" (UID: "50b8c093-2eb4-4220-b335-b5b94fb8776e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.007829 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b8c093-2eb4-4220-b335-b5b94fb8776e-scripts" (OuterVolumeSpecName: "scripts") pod "50b8c093-2eb4-4220-b335-b5b94fb8776e" (UID: "50b8c093-2eb4-4220-b335-b5b94fb8776e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.009130 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50b8c093-2eb4-4220-b335-b5b94fb8776e-kube-api-access-swrdw" (OuterVolumeSpecName: "kube-api-access-swrdw") pod "50b8c093-2eb4-4220-b335-b5b94fb8776e" (UID: "50b8c093-2eb4-4220-b335-b5b94fb8776e"). InnerVolumeSpecName "kube-api-access-swrdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.036731 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b8c093-2eb4-4220-b335-b5b94fb8776e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50b8c093-2eb4-4220-b335-b5b94fb8776e" (UID: "50b8c093-2eb4-4220-b335-b5b94fb8776e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.042254 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b8c093-2eb4-4220-b335-b5b94fb8776e-config-data" (OuterVolumeSpecName: "config-data") pod "50b8c093-2eb4-4220-b335-b5b94fb8776e" (UID: "50b8c093-2eb4-4220-b335-b5b94fb8776e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.106019 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50b8c093-2eb4-4220-b335-b5b94fb8776e-logs\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.106060 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b8c093-2eb4-4220-b335-b5b94fb8776e-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.106074 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swrdw\" (UniqueName: \"kubernetes.io/projected/50b8c093-2eb4-4220-b335-b5b94fb8776e-kube-api-access-swrdw\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.106087 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b8c093-2eb4-4220-b335-b5b94fb8776e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.106096 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b8c093-2eb4-4220-b335-b5b94fb8776e-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.455342 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gk2zt" 
event={"ID":"50b8c093-2eb4-4220-b335-b5b94fb8776e","Type":"ContainerDied","Data":"232fee36f86a085fa302a1b50fdcbc82d09b4a4f40df44365ab7bfb5be80bc99"} Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.455399 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="232fee36f86a085fa302a1b50fdcbc82d09b4a4f40df44365ab7bfb5be80bc99" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.455358 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-gk2zt" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.464959 4869 generic.go:334] "Generic (PLEG): container finished" podID="c953d279-bde4-4a98-87d5-3cbcaefc875c" containerID="accb33c9fcc62c78ffa3fb316288bd7d9ed68be05f9e7fc595f6cbbea44f8ced" exitCode=0 Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.465040 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8wq6q" event={"ID":"c953d279-bde4-4a98-87d5-3cbcaefc875c","Type":"ContainerDied","Data":"accb33c9fcc62c78ffa3fb316288bd7d9ed68be05f9e7fc595f6cbbea44f8ced"} Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.539383 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5466c5cc6f-xd5xf" podStartSLOduration=3.539355191 podStartE2EDuration="3.539355191s" podCreationTimestamp="2026-02-18 06:05:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:01.518971946 +0000 UTC m=+1058.688060178" watchObservedRunningTime="2026-02-18 06:06:01.539355191 +0000 UTC m=+1058.708443423" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.547371 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5466c5cc6f-xd5xf" event={"ID":"2f13a3e9-97c1-4eaa-a0fb-f449a201a542","Type":"ContainerStarted","Data":"00595b0d60cbad75fbb436effe6268e3f8744bf8bf10b2a142cfd6cc2a285089"} Feb 18 
06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.547689 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5466c5cc6f-xd5xf" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.570891 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6558576dd4-9gx96"] Feb 18 06:06:01 crc kubenswrapper[4869]: E0218 06:06:01.571341 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50b8c093-2eb4-4220-b335-b5b94fb8776e" containerName="placement-db-sync" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.571365 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="50b8c093-2eb4-4220-b335-b5b94fb8776e" containerName="placement-db-sync" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.571540 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="50b8c093-2eb4-4220-b335-b5b94fb8776e" containerName="placement-db-sync" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.572572 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.576029 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.576194 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.576340 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-fnpts" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.576483 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.577580 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.606879 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6558576dd4-9gx96"] Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.728021 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-combined-ca-bundle\") pod \"placement-6558576dd4-9gx96\" (UID: \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.728090 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-config-data\") pod \"placement-6558576dd4-9gx96\" (UID: \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.728156 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-internal-tls-certs\") pod \"placement-6558576dd4-9gx96\" (UID: \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.728185 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-public-tls-certs\") pod \"placement-6558576dd4-9gx96\" (UID: \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.728207 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-logs\") pod \"placement-6558576dd4-9gx96\" (UID: \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.728530 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-scripts\") pod \"placement-6558576dd4-9gx96\" (UID: \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.728885 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq4nj\" (UniqueName: \"kubernetes.io/projected/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-kube-api-access-nq4nj\") pod \"placement-6558576dd4-9gx96\" (UID: \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.831153 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-scripts\") pod \"placement-6558576dd4-9gx96\" (UID: \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.831261 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq4nj\" (UniqueName: \"kubernetes.io/projected/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-kube-api-access-nq4nj\") pod \"placement-6558576dd4-9gx96\" (UID: \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.831306 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-combined-ca-bundle\") pod \"placement-6558576dd4-9gx96\" (UID: \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.831345 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-config-data\") pod \"placement-6558576dd4-9gx96\" (UID: \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.831378 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-internal-tls-certs\") pod \"placement-6558576dd4-9gx96\" (UID: \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.831407 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-public-tls-certs\") pod \"placement-6558576dd4-9gx96\" (UID: \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.831428 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-logs\") pod \"placement-6558576dd4-9gx96\" (UID: \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.831919 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-logs\") pod \"placement-6558576dd4-9gx96\" (UID: \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.842292 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-internal-tls-certs\") pod \"placement-6558576dd4-9gx96\" (UID: \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.847230 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-public-tls-certs\") pod \"placement-6558576dd4-9gx96\" (UID: \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.854413 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-combined-ca-bundle\") pod 
\"placement-6558576dd4-9gx96\" (UID: \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.859410 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-config-data\") pod \"placement-6558576dd4-9gx96\" (UID: \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.866403 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-scripts\") pod \"placement-6558576dd4-9gx96\" (UID: \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.867301 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq4nj\" (UniqueName: \"kubernetes.io/projected/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-kube-api-access-nq4nj\") pod \"placement-6558576dd4-9gx96\" (UID: \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:01 crc kubenswrapper[4869]: I0218 06:06:01.907476 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:02 crc kubenswrapper[4869]: I0218 06:06:02.513335 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6558576dd4-9gx96"] Feb 18 06:06:02 crc kubenswrapper[4869]: I0218 06:06:02.916921 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8wq6q" Feb 18 06:06:02 crc kubenswrapper[4869]: I0218 06:06:02.956098 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-combined-ca-bundle\") pod \"c953d279-bde4-4a98-87d5-3cbcaefc875c\" (UID: \"c953d279-bde4-4a98-87d5-3cbcaefc875c\") " Feb 18 06:06:02 crc kubenswrapper[4869]: I0218 06:06:02.956171 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-fernet-keys\") pod \"c953d279-bde4-4a98-87d5-3cbcaefc875c\" (UID: \"c953d279-bde4-4a98-87d5-3cbcaefc875c\") " Feb 18 06:06:02 crc kubenswrapper[4869]: I0218 06:06:02.956275 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-credential-keys\") pod \"c953d279-bde4-4a98-87d5-3cbcaefc875c\" (UID: \"c953d279-bde4-4a98-87d5-3cbcaefc875c\") " Feb 18 06:06:02 crc kubenswrapper[4869]: I0218 06:06:02.956387 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnkpb\" (UniqueName: \"kubernetes.io/projected/c953d279-bde4-4a98-87d5-3cbcaefc875c-kube-api-access-fnkpb\") pod \"c953d279-bde4-4a98-87d5-3cbcaefc875c\" (UID: \"c953d279-bde4-4a98-87d5-3cbcaefc875c\") " Feb 18 06:06:02 crc kubenswrapper[4869]: I0218 06:06:02.956469 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-scripts\") pod \"c953d279-bde4-4a98-87d5-3cbcaefc875c\" (UID: \"c953d279-bde4-4a98-87d5-3cbcaefc875c\") " Feb 18 06:06:02 crc kubenswrapper[4869]: I0218 06:06:02.956493 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-config-data\") pod \"c953d279-bde4-4a98-87d5-3cbcaefc875c\" (UID: \"c953d279-bde4-4a98-87d5-3cbcaefc875c\") " Feb 18 06:06:02 crc kubenswrapper[4869]: I0218 06:06:02.997594 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c953d279-bde4-4a98-87d5-3cbcaefc875c" (UID: "c953d279-bde4-4a98-87d5-3cbcaefc875c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.000215 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-scripts" (OuterVolumeSpecName: "scripts") pod "c953d279-bde4-4a98-87d5-3cbcaefc875c" (UID: "c953d279-bde4-4a98-87d5-3cbcaefc875c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.001084 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c953d279-bde4-4a98-87d5-3cbcaefc875c" (UID: "c953d279-bde4-4a98-87d5-3cbcaefc875c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.004072 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c953d279-bde4-4a98-87d5-3cbcaefc875c-kube-api-access-fnkpb" (OuterVolumeSpecName: "kube-api-access-fnkpb") pod "c953d279-bde4-4a98-87d5-3cbcaefc875c" (UID: "c953d279-bde4-4a98-87d5-3cbcaefc875c"). InnerVolumeSpecName "kube-api-access-fnkpb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.005866 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c953d279-bde4-4a98-87d5-3cbcaefc875c" (UID: "c953d279-bde4-4a98-87d5-3cbcaefc875c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.014012 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-config-data" (OuterVolumeSpecName: "config-data") pod "c953d279-bde4-4a98-87d5-3cbcaefc875c" (UID: "c953d279-bde4-4a98-87d5-3cbcaefc875c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.057957 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnkpb\" (UniqueName: \"kubernetes.io/projected/c953d279-bde4-4a98-87d5-3cbcaefc875c-kube-api-access-fnkpb\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.057993 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.058002 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.058011 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 
18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.058019 4869 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.058029 4869 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c953d279-bde4-4a98-87d5-3cbcaefc875c-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.541089 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6558576dd4-9gx96" event={"ID":"953c9b1f-e673-499d-a6bf-9a20e8d4e69e","Type":"ContainerStarted","Data":"24bd6d819bf8cd5bfda9ebdaec36688778e6e8216b265ef48658d29cfda6cb60"} Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.542939 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6558576dd4-9gx96" event={"ID":"953c9b1f-e673-499d-a6bf-9a20e8d4e69e","Type":"ContainerStarted","Data":"108392cc67f37203d82b4e56175a13dcda0e43b7e69dbbf296cf6108030f4cfb"} Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.551192 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8wq6q" event={"ID":"c953d279-bde4-4a98-87d5-3cbcaefc875c","Type":"ContainerDied","Data":"6781803eb6b46d0a6fec1f481af5e3c4082d21940be0f54a0fcf431cd054a92b"} Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.551248 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6781803eb6b46d0a6fec1f481af5e3c4082d21940be0f54a0fcf431cd054a92b" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.551328 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8wq6q" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.580545 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-666896fcd4-c65vb"] Feb 18 06:06:03 crc kubenswrapper[4869]: E0218 06:06:03.581005 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c953d279-bde4-4a98-87d5-3cbcaefc875c" containerName="keystone-bootstrap" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.581020 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="c953d279-bde4-4a98-87d5-3cbcaefc875c" containerName="keystone-bootstrap" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.583623 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="c953d279-bde4-4a98-87d5-3cbcaefc875c" containerName="keystone-bootstrap" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.590962 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-666896fcd4-c65vb" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.594818 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.597633 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.605311 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-666896fcd4-c65vb"] Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.597975 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-h7tvj" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.598311 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.598806 4869 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cert-keystone-internal-svc" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.599393 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.636817 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.640022 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.669345 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e56418e8-0afb-47a6-9064-ff0a381ef2ba-config-data\") pod \"keystone-666896fcd4-c65vb\" (UID: \"e56418e8-0afb-47a6-9064-ff0a381ef2ba\") " pod="openstack/keystone-666896fcd4-c65vb" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.670049 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e56418e8-0afb-47a6-9064-ff0a381ef2ba-public-tls-certs\") pod \"keystone-666896fcd4-c65vb\" (UID: \"e56418e8-0afb-47a6-9064-ff0a381ef2ba\") " pod="openstack/keystone-666896fcd4-c65vb" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.670093 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd6q6\" (UniqueName: \"kubernetes.io/projected/e56418e8-0afb-47a6-9064-ff0a381ef2ba-kube-api-access-jd6q6\") pod \"keystone-666896fcd4-c65vb\" (UID: \"e56418e8-0afb-47a6-9064-ff0a381ef2ba\") " pod="openstack/keystone-666896fcd4-c65vb" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.670147 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/e56418e8-0afb-47a6-9064-ff0a381ef2ba-credential-keys\") pod \"keystone-666896fcd4-c65vb\" (UID: \"e56418e8-0afb-47a6-9064-ff0a381ef2ba\") " pod="openstack/keystone-666896fcd4-c65vb" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.670212 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e56418e8-0afb-47a6-9064-ff0a381ef2ba-fernet-keys\") pod \"keystone-666896fcd4-c65vb\" (UID: \"e56418e8-0afb-47a6-9064-ff0a381ef2ba\") " pod="openstack/keystone-666896fcd4-c65vb" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.670283 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e56418e8-0afb-47a6-9064-ff0a381ef2ba-internal-tls-certs\") pod \"keystone-666896fcd4-c65vb\" (UID: \"e56418e8-0afb-47a6-9064-ff0a381ef2ba\") " pod="openstack/keystone-666896fcd4-c65vb" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.670307 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e56418e8-0afb-47a6-9064-ff0a381ef2ba-scripts\") pod \"keystone-666896fcd4-c65vb\" (UID: \"e56418e8-0afb-47a6-9064-ff0a381ef2ba\") " pod="openstack/keystone-666896fcd4-c65vb" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.670369 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e56418e8-0afb-47a6-9064-ff0a381ef2ba-combined-ca-bundle\") pod \"keystone-666896fcd4-c65vb\" (UID: \"e56418e8-0afb-47a6-9064-ff0a381ef2ba\") " pod="openstack/keystone-666896fcd4-c65vb" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.689204 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 18 06:06:03 crc 
kubenswrapper[4869]: I0218 06:06:03.705237 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.772055 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e56418e8-0afb-47a6-9064-ff0a381ef2ba-combined-ca-bundle\") pod \"keystone-666896fcd4-c65vb\" (UID: \"e56418e8-0afb-47a6-9064-ff0a381ef2ba\") " pod="openstack/keystone-666896fcd4-c65vb" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.772137 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e56418e8-0afb-47a6-9064-ff0a381ef2ba-config-data\") pod \"keystone-666896fcd4-c65vb\" (UID: \"e56418e8-0afb-47a6-9064-ff0a381ef2ba\") " pod="openstack/keystone-666896fcd4-c65vb" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.772177 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e56418e8-0afb-47a6-9064-ff0a381ef2ba-public-tls-certs\") pod \"keystone-666896fcd4-c65vb\" (UID: \"e56418e8-0afb-47a6-9064-ff0a381ef2ba\") " pod="openstack/keystone-666896fcd4-c65vb" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.772226 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd6q6\" (UniqueName: \"kubernetes.io/projected/e56418e8-0afb-47a6-9064-ff0a381ef2ba-kube-api-access-jd6q6\") pod \"keystone-666896fcd4-c65vb\" (UID: \"e56418e8-0afb-47a6-9064-ff0a381ef2ba\") " pod="openstack/keystone-666896fcd4-c65vb" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.772266 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e56418e8-0afb-47a6-9064-ff0a381ef2ba-credential-keys\") pod \"keystone-666896fcd4-c65vb\" 
(UID: \"e56418e8-0afb-47a6-9064-ff0a381ef2ba\") " pod="openstack/keystone-666896fcd4-c65vb" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.772312 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e56418e8-0afb-47a6-9064-ff0a381ef2ba-fernet-keys\") pod \"keystone-666896fcd4-c65vb\" (UID: \"e56418e8-0afb-47a6-9064-ff0a381ef2ba\") " pod="openstack/keystone-666896fcd4-c65vb" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.772371 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e56418e8-0afb-47a6-9064-ff0a381ef2ba-internal-tls-certs\") pod \"keystone-666896fcd4-c65vb\" (UID: \"e56418e8-0afb-47a6-9064-ff0a381ef2ba\") " pod="openstack/keystone-666896fcd4-c65vb" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.772397 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e56418e8-0afb-47a6-9064-ff0a381ef2ba-scripts\") pod \"keystone-666896fcd4-c65vb\" (UID: \"e56418e8-0afb-47a6-9064-ff0a381ef2ba\") " pod="openstack/keystone-666896fcd4-c65vb" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.778459 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e56418e8-0afb-47a6-9064-ff0a381ef2ba-scripts\") pod \"keystone-666896fcd4-c65vb\" (UID: \"e56418e8-0afb-47a6-9064-ff0a381ef2ba\") " pod="openstack/keystone-666896fcd4-c65vb" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.778858 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e56418e8-0afb-47a6-9064-ff0a381ef2ba-fernet-keys\") pod \"keystone-666896fcd4-c65vb\" (UID: \"e56418e8-0afb-47a6-9064-ff0a381ef2ba\") " pod="openstack/keystone-666896fcd4-c65vb" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 
06:06:03.780190 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e56418e8-0afb-47a6-9064-ff0a381ef2ba-combined-ca-bundle\") pod \"keystone-666896fcd4-c65vb\" (UID: \"e56418e8-0afb-47a6-9064-ff0a381ef2ba\") " pod="openstack/keystone-666896fcd4-c65vb" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.781864 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e56418e8-0afb-47a6-9064-ff0a381ef2ba-config-data\") pod \"keystone-666896fcd4-c65vb\" (UID: \"e56418e8-0afb-47a6-9064-ff0a381ef2ba\") " pod="openstack/keystone-666896fcd4-c65vb" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.782128 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e56418e8-0afb-47a6-9064-ff0a381ef2ba-internal-tls-certs\") pod \"keystone-666896fcd4-c65vb\" (UID: \"e56418e8-0afb-47a6-9064-ff0a381ef2ba\") " pod="openstack/keystone-666896fcd4-c65vb" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.783860 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e56418e8-0afb-47a6-9064-ff0a381ef2ba-credential-keys\") pod \"keystone-666896fcd4-c65vb\" (UID: \"e56418e8-0afb-47a6-9064-ff0a381ef2ba\") " pod="openstack/keystone-666896fcd4-c65vb" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.791357 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e56418e8-0afb-47a6-9064-ff0a381ef2ba-public-tls-certs\") pod \"keystone-666896fcd4-c65vb\" (UID: \"e56418e8-0afb-47a6-9064-ff0a381ef2ba\") " pod="openstack/keystone-666896fcd4-c65vb" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.801063 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd6q6\" 
(UniqueName: \"kubernetes.io/projected/e56418e8-0afb-47a6-9064-ff0a381ef2ba-kube-api-access-jd6q6\") pod \"keystone-666896fcd4-c65vb\" (UID: \"e56418e8-0afb-47a6-9064-ff0a381ef2ba\") " pod="openstack/keystone-666896fcd4-c65vb" Feb 18 06:06:03 crc kubenswrapper[4869]: I0218 06:06:03.957210 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-666896fcd4-c65vb" Feb 18 06:06:04 crc kubenswrapper[4869]: I0218 06:06:04.362658 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 18 06:06:04 crc kubenswrapper[4869]: I0218 06:06:04.362707 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 18 06:06:04 crc kubenswrapper[4869]: I0218 06:06:04.458988 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 18 06:06:04 crc kubenswrapper[4869]: I0218 06:06:04.480710 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 18 06:06:04 crc kubenswrapper[4869]: I0218 06:06:04.560465 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 18 06:06:04 crc kubenswrapper[4869]: I0218 06:06:04.560500 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 18 06:06:04 crc kubenswrapper[4869]: I0218 06:06:04.560510 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 18 06:06:04 crc kubenswrapper[4869]: I0218 06:06:04.560518 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 18 06:06:04 crc kubenswrapper[4869]: I0218 06:06:04.764024 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:06:04 crc kubenswrapper[4869]: I0218 06:06:04.764072 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:06:04 crc kubenswrapper[4869]: I0218 06:06:04.894419 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-57f5fddd88-qhh5n" Feb 18 06:06:04 crc kubenswrapper[4869]: I0218 06:06:04.894470 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-57f5fddd88-qhh5n" Feb 18 06:06:05 crc kubenswrapper[4869]: I0218 06:06:05.084691 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" Feb 18 06:06:05 crc kubenswrapper[4869]: I0218 06:06:05.159062 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pnf76"] Feb 18 06:06:05 crc kubenswrapper[4869]: I0218 06:06:05.159303 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" podUID="89712bb0-67ea-4538-9b51-82a5b629c048" containerName="dnsmasq-dns" containerID="cri-o://f6fba7a1a763e656bb1ccede00c69b1cdeb2d81601366d2e062fa7b6a19e8a77" gracePeriod=10 Feb 18 06:06:05 crc kubenswrapper[4869]: I0218 06:06:05.574178 4869 generic.go:334] "Generic (PLEG): container finished" podID="89712bb0-67ea-4538-9b51-82a5b629c048" containerID="f6fba7a1a763e656bb1ccede00c69b1cdeb2d81601366d2e062fa7b6a19e8a77" exitCode=0 Feb 18 06:06:05 crc kubenswrapper[4869]: I0218 06:06:05.574259 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" event={"ID":"89712bb0-67ea-4538-9b51-82a5b629c048","Type":"ContainerDied","Data":"f6fba7a1a763e656bb1ccede00c69b1cdeb2d81601366d2e062fa7b6a19e8a77"} Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.213766 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-55b8768b96-mwv6g"] 
Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.215401 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-55b8768b96-mwv6g" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.244132 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55b8768b96-mwv6g"] Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.326183 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-76444b47f5-mc7kc" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.340721 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/965b4f29-cb41-4066-a9e6-3729ec43b2bd-combined-ca-bundle\") pod \"placement-55b8768b96-mwv6g\" (UID: \"965b4f29-cb41-4066-a9e6-3729ec43b2bd\") " pod="openstack/placement-55b8768b96-mwv6g" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.340807 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/965b4f29-cb41-4066-a9e6-3729ec43b2bd-internal-tls-certs\") pod \"placement-55b8768b96-mwv6g\" (UID: \"965b4f29-cb41-4066-a9e6-3729ec43b2bd\") " pod="openstack/placement-55b8768b96-mwv6g" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.340837 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjlbn\" (UniqueName: \"kubernetes.io/projected/965b4f29-cb41-4066-a9e6-3729ec43b2bd-kube-api-access-fjlbn\") pod \"placement-55b8768b96-mwv6g\" (UID: \"965b4f29-cb41-4066-a9e6-3729ec43b2bd\") " pod="openstack/placement-55b8768b96-mwv6g" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.340905 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/965b4f29-cb41-4066-a9e6-3729ec43b2bd-config-data\") pod \"placement-55b8768b96-mwv6g\" (UID: \"965b4f29-cb41-4066-a9e6-3729ec43b2bd\") " pod="openstack/placement-55b8768b96-mwv6g" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.340960 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/965b4f29-cb41-4066-a9e6-3729ec43b2bd-public-tls-certs\") pod \"placement-55b8768b96-mwv6g\" (UID: \"965b4f29-cb41-4066-a9e6-3729ec43b2bd\") " pod="openstack/placement-55b8768b96-mwv6g" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.340991 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/965b4f29-cb41-4066-a9e6-3729ec43b2bd-scripts\") pod \"placement-55b8768b96-mwv6g\" (UID: \"965b4f29-cb41-4066-a9e6-3729ec43b2bd\") " pod="openstack/placement-55b8768b96-mwv6g" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.341009 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/965b4f29-cb41-4066-a9e6-3729ec43b2bd-logs\") pod \"placement-55b8768b96-mwv6g\" (UID: \"965b4f29-cb41-4066-a9e6-3729ec43b2bd\") " pod="openstack/placement-55b8768b96-mwv6g" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.442349 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/965b4f29-cb41-4066-a9e6-3729ec43b2bd-scripts\") pod \"placement-55b8768b96-mwv6g\" (UID: \"965b4f29-cb41-4066-a9e6-3729ec43b2bd\") " pod="openstack/placement-55b8768b96-mwv6g" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.442396 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/965b4f29-cb41-4066-a9e6-3729ec43b2bd-logs\") pod 
\"placement-55b8768b96-mwv6g\" (UID: \"965b4f29-cb41-4066-a9e6-3729ec43b2bd\") " pod="openstack/placement-55b8768b96-mwv6g" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.442479 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/965b4f29-cb41-4066-a9e6-3729ec43b2bd-combined-ca-bundle\") pod \"placement-55b8768b96-mwv6g\" (UID: \"965b4f29-cb41-4066-a9e6-3729ec43b2bd\") " pod="openstack/placement-55b8768b96-mwv6g" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.442509 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/965b4f29-cb41-4066-a9e6-3729ec43b2bd-internal-tls-certs\") pod \"placement-55b8768b96-mwv6g\" (UID: \"965b4f29-cb41-4066-a9e6-3729ec43b2bd\") " pod="openstack/placement-55b8768b96-mwv6g" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.442535 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjlbn\" (UniqueName: \"kubernetes.io/projected/965b4f29-cb41-4066-a9e6-3729ec43b2bd-kube-api-access-fjlbn\") pod \"placement-55b8768b96-mwv6g\" (UID: \"965b4f29-cb41-4066-a9e6-3729ec43b2bd\") " pod="openstack/placement-55b8768b96-mwv6g" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.442562 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/965b4f29-cb41-4066-a9e6-3729ec43b2bd-config-data\") pod \"placement-55b8768b96-mwv6g\" (UID: \"965b4f29-cb41-4066-a9e6-3729ec43b2bd\") " pod="openstack/placement-55b8768b96-mwv6g" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.442612 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/965b4f29-cb41-4066-a9e6-3729ec43b2bd-public-tls-certs\") pod \"placement-55b8768b96-mwv6g\" (UID: 
\"965b4f29-cb41-4066-a9e6-3729ec43b2bd\") " pod="openstack/placement-55b8768b96-mwv6g" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.443501 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/965b4f29-cb41-4066-a9e6-3729ec43b2bd-logs\") pod \"placement-55b8768b96-mwv6g\" (UID: \"965b4f29-cb41-4066-a9e6-3729ec43b2bd\") " pod="openstack/placement-55b8768b96-mwv6g" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.453664 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/965b4f29-cb41-4066-a9e6-3729ec43b2bd-public-tls-certs\") pod \"placement-55b8768b96-mwv6g\" (UID: \"965b4f29-cb41-4066-a9e6-3729ec43b2bd\") " pod="openstack/placement-55b8768b96-mwv6g" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.454665 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/965b4f29-cb41-4066-a9e6-3729ec43b2bd-scripts\") pod \"placement-55b8768b96-mwv6g\" (UID: \"965b4f29-cb41-4066-a9e6-3729ec43b2bd\") " pod="openstack/placement-55b8768b96-mwv6g" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.454939 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/965b4f29-cb41-4066-a9e6-3729ec43b2bd-internal-tls-certs\") pod \"placement-55b8768b96-mwv6g\" (UID: \"965b4f29-cb41-4066-a9e6-3729ec43b2bd\") " pod="openstack/placement-55b8768b96-mwv6g" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.456432 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/965b4f29-cb41-4066-a9e6-3729ec43b2bd-config-data\") pod \"placement-55b8768b96-mwv6g\" (UID: \"965b4f29-cb41-4066-a9e6-3729ec43b2bd\") " pod="openstack/placement-55b8768b96-mwv6g" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.458346 4869 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/965b4f29-cb41-4066-a9e6-3729ec43b2bd-combined-ca-bundle\") pod \"placement-55b8768b96-mwv6g\" (UID: \"965b4f29-cb41-4066-a9e6-3729ec43b2bd\") " pod="openstack/placement-55b8768b96-mwv6g" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.465307 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjlbn\" (UniqueName: \"kubernetes.io/projected/965b4f29-cb41-4066-a9e6-3729ec43b2bd-kube-api-access-fjlbn\") pod \"placement-55b8768b96-mwv6g\" (UID: \"965b4f29-cb41-4066-a9e6-3729ec43b2bd\") " pod="openstack/placement-55b8768b96-mwv6g" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.492154 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" podUID="89712bb0-67ea-4538-9b51-82a5b629c048" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.144:5353: connect: connection refused" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.541175 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-55b8768b96-mwv6g" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.595834 4869 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.595864 4869 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.596645 4869 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 06:06:06 crc kubenswrapper[4869]: I0218 06:06:06.596664 4869 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 06:06:07 crc kubenswrapper[4869]: I0218 06:06:07.038618 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 18 06:06:07 crc kubenswrapper[4869]: I0218 06:06:07.311547 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 18 06:06:07 crc kubenswrapper[4869]: I0218 06:06:07.571675 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 18 06:06:07 crc kubenswrapper[4869]: I0218 06:06:07.606384 4869 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 06:06:08 crc kubenswrapper[4869]: I0218 06:06:08.064483 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 18 06:06:10 crc kubenswrapper[4869]: I0218 06:06:10.780342 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.060855 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brdhp\" (UniqueName: \"kubernetes.io/projected/89712bb0-67ea-4538-9b51-82a5b629c048-kube-api-access-brdhp\") pod \"89712bb0-67ea-4538-9b51-82a5b629c048\" (UID: \"89712bb0-67ea-4538-9b51-82a5b629c048\") " Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.061315 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-dns-swift-storage-0\") pod \"89712bb0-67ea-4538-9b51-82a5b629c048\" (UID: \"89712bb0-67ea-4538-9b51-82a5b629c048\") " Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.061398 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-ovsdbserver-sb\") pod \"89712bb0-67ea-4538-9b51-82a5b629c048\" (UID: \"89712bb0-67ea-4538-9b51-82a5b629c048\") " Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.061447 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-dns-svc\") pod \"89712bb0-67ea-4538-9b51-82a5b629c048\" (UID: \"89712bb0-67ea-4538-9b51-82a5b629c048\") " Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.061492 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-config\") pod \"89712bb0-67ea-4538-9b51-82a5b629c048\" (UID: \"89712bb0-67ea-4538-9b51-82a5b629c048\") " Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.061510 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-ovsdbserver-nb\") pod \"89712bb0-67ea-4538-9b51-82a5b629c048\" (UID: \"89712bb0-67ea-4538-9b51-82a5b629c048\") " Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.072925 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89712bb0-67ea-4538-9b51-82a5b629c048-kube-api-access-brdhp" (OuterVolumeSpecName: "kube-api-access-brdhp") pod "89712bb0-67ea-4538-9b51-82a5b629c048" (UID: "89712bb0-67ea-4538-9b51-82a5b629c048"). InnerVolumeSpecName "kube-api-access-brdhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.127529 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "89712bb0-67ea-4538-9b51-82a5b629c048" (UID: "89712bb0-67ea-4538-9b51-82a5b629c048"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.144515 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "89712bb0-67ea-4538-9b51-82a5b629c048" (UID: "89712bb0-67ea-4538-9b51-82a5b629c048"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.156919 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "89712bb0-67ea-4538-9b51-82a5b629c048" (UID: "89712bb0-67ea-4538-9b51-82a5b629c048"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.164436 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brdhp\" (UniqueName: \"kubernetes.io/projected/89712bb0-67ea-4538-9b51-82a5b629c048-kube-api-access-brdhp\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.164466 4869 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.164476 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.164487 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.169196 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "89712bb0-67ea-4538-9b51-82a5b629c048" (UID: "89712bb0-67ea-4538-9b51-82a5b629c048"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.187528 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-config" (OuterVolumeSpecName: "config") pod "89712bb0-67ea-4538-9b51-82a5b629c048" (UID: "89712bb0-67ea-4538-9b51-82a5b629c048"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.265922 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.265980 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89712bb0-67ea-4538-9b51-82a5b629c048-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.524249 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55b8768b96-mwv6g"] Feb 18 06:06:11 crc kubenswrapper[4869]: W0218 06:06:11.541193 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod965b4f29_cb41_4066_a9e6_3729ec43b2bd.slice/crio-428e1138dfa4e99ba9e05a20b33ca3cc8534fa9294e8c544fe1e4c793ab6359b WatchSource:0}: Error finding container 428e1138dfa4e99ba9e05a20b33ca3cc8534fa9294e8c544fe1e4c793ab6359b: Status 404 returned error can't find the container with id 428e1138dfa4e99ba9e05a20b33ca3cc8534fa9294e8c544fe1e4c793ab6359b Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.667947 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-666896fcd4-c65vb"] Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.678902 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55b8768b96-mwv6g" event={"ID":"965b4f29-cb41-4066-a9e6-3729ec43b2bd","Type":"ContainerStarted","Data":"428e1138dfa4e99ba9e05a20b33ca3cc8534fa9294e8c544fe1e4c793ab6359b"} Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.688472 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" 
event={"ID":"89712bb0-67ea-4538-9b51-82a5b629c048","Type":"ContainerDied","Data":"54562d11f55656054de8c934b1115eb1ccf080952299f5e65a78355e66d4918b"} Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.688527 4869 scope.go:117] "RemoveContainer" containerID="f6fba7a1a763e656bb1ccede00c69b1cdeb2d81601366d2e062fa7b6a19e8a77" Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.688613 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-pnf76" Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.697076 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91512d0d-84f7-41c0-aca8-cbf9d2839927","Type":"ContainerStarted","Data":"cb76054c6f93b947662ea5d20ca8b46ce0707c8f6eb72b79dfdf979cd0c1f034"} Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.705779 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6558576dd4-9gx96" event={"ID":"953c9b1f-e673-499d-a6bf-9a20e8d4e69e","Type":"ContainerStarted","Data":"0610aa1c580eda1201bb0355e4a294c7c1ed25aed49c83ad5fd9185ce11a18b7"} Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.706951 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.707215 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.710415 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zzcxf" event={"ID":"ffdffd9a-f626-4bf2-b1e0-104eca55e7f5","Type":"ContainerStarted","Data":"56db69629c5551f72754a181b5e9e12c994e64a0bd704b08a7755e193c82ade4"} Feb 18 06:06:11 crc kubenswrapper[4869]: W0218 06:06:11.730602 4869 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode56418e8_0afb_47a6_9064_ff0a381ef2ba.slice/crio-c60069c57340012f54d397a89d371d499fb3b76f275f904b61f843c7d680bbfc WatchSource:0}: Error finding container c60069c57340012f54d397a89d371d499fb3b76f275f904b61f843c7d680bbfc: Status 404 returned error can't find the container with id c60069c57340012f54d397a89d371d499fb3b76f275f904b61f843c7d680bbfc
Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.732849 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6558576dd4-9gx96" podStartSLOduration=10.732834605 podStartE2EDuration="10.732834605s" podCreationTimestamp="2026-02-18 06:06:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:11.728964521 +0000 UTC m=+1068.898052753" watchObservedRunningTime="2026-02-18 06:06:11.732834605 +0000 UTC m=+1068.901922847"
Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.752618 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-zzcxf" podStartSLOduration=3.083032274 podStartE2EDuration="45.752604005s" podCreationTimestamp="2026-02-18 06:05:26 +0000 UTC" firstStartedPulling="2026-02-18 06:05:28.472323326 +0000 UTC m=+1025.641411558" lastFinishedPulling="2026-02-18 06:06:11.141895057 +0000 UTC m=+1068.310983289" observedRunningTime="2026-02-18 06:06:11.752453202 +0000 UTC m=+1068.921541434" watchObservedRunningTime="2026-02-18 06:06:11.752604005 +0000 UTC m=+1068.921692237"
Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.921290 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pnf76"]
Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.941842 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pnf76"]
Feb 18 06:06:11 crc kubenswrapper[4869]: I0218 06:06:11.945315 4869 scope.go:117] "RemoveContainer" containerID="b28392a3421514293f0ad34c0a53497dba87ca0d5e150417d6a07bc265c84277"
Feb 18 06:06:12 crc kubenswrapper[4869]: I0218 06:06:12.719471 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-8v5fn" event={"ID":"77d2d3cf-1108-468b-816a-64d29471542e","Type":"ContainerStarted","Data":"5710b416d77ba4c1b05282bf3850529b3d884abdc7dc8b9bfb141a5daf9abf53"}
Feb 18 06:06:12 crc kubenswrapper[4869]: I0218 06:06:12.722389 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55b8768b96-mwv6g" event={"ID":"965b4f29-cb41-4066-a9e6-3729ec43b2bd","Type":"ContainerStarted","Data":"807c48fce696894498afa8bc9852b773adbe91ff4486e42cbeba682f2d7e5aa3"}
Feb 18 06:06:12 crc kubenswrapper[4869]: I0218 06:06:12.722523 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55b8768b96-mwv6g" event={"ID":"965b4f29-cb41-4066-a9e6-3729ec43b2bd","Type":"ContainerStarted","Data":"1638f9cad84ccf40b69c38585f7aed36025355d19dc6839a4d6224b012280281"}
Feb 18 06:06:12 crc kubenswrapper[4869]: I0218 06:06:12.723041 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55b8768b96-mwv6g"
Feb 18 06:06:12 crc kubenswrapper[4869]: I0218 06:06:12.723136 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55b8768b96-mwv6g"
Feb 18 06:06:12 crc kubenswrapper[4869]: I0218 06:06:12.726488 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-666896fcd4-c65vb" event={"ID":"e56418e8-0afb-47a6-9064-ff0a381ef2ba","Type":"ContainerStarted","Data":"315033e24b83839f7e55d72b84c935abe648cc46195610d66ccb526d28f16e94"}
Feb 18 06:06:12 crc kubenswrapper[4869]: I0218 06:06:12.726590 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-666896fcd4-c65vb"
Feb 18 06:06:12 crc kubenswrapper[4869]: I0218 06:06:12.726661 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-666896fcd4-c65vb" event={"ID":"e56418e8-0afb-47a6-9064-ff0a381ef2ba","Type":"ContainerStarted","Data":"c60069c57340012f54d397a89d371d499fb3b76f275f904b61f843c7d680bbfc"}
Feb 18 06:06:12 crc kubenswrapper[4869]: I0218 06:06:12.747277 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-8v5fn" podStartSLOduration=3.847513293 podStartE2EDuration="47.747259383s" podCreationTimestamp="2026-02-18 06:05:25 +0000 UTC" firstStartedPulling="2026-02-18 06:05:27.243987151 +0000 UTC m=+1024.413075383" lastFinishedPulling="2026-02-18 06:06:11.143733241 +0000 UTC m=+1068.312821473" observedRunningTime="2026-02-18 06:06:12.737875234 +0000 UTC m=+1069.906963466" watchObservedRunningTime="2026-02-18 06:06:12.747259383 +0000 UTC m=+1069.916347615"
Feb 18 06:06:12 crc kubenswrapper[4869]: I0218 06:06:12.793142 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-666896fcd4-c65vb" podStartSLOduration=9.793120477 podStartE2EDuration="9.793120477s" podCreationTimestamp="2026-02-18 06:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:12.768488948 +0000 UTC m=+1069.937577180" watchObservedRunningTime="2026-02-18 06:06:12.793120477 +0000 UTC m=+1069.962208709"
Feb 18 06:06:13 crc kubenswrapper[4869]: I0218 06:06:13.223506 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6558576dd4-9gx96"
Feb 18 06:06:13 crc kubenswrapper[4869]: I0218 06:06:13.249763 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-55b8768b96-mwv6g" podStartSLOduration=7.249731262 podStartE2EDuration="7.249731262s" podCreationTimestamp="2026-02-18 06:06:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:12.795573927 +0000 UTC m=+1069.964662149" watchObservedRunningTime="2026-02-18 06:06:13.249731262 +0000 UTC m=+1070.418819494"
Feb 18 06:06:13 crc kubenswrapper[4869]: I0218 06:06:13.485023 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89712bb0-67ea-4538-9b51-82a5b629c048" path="/var/lib/kubelet/pods/89712bb0-67ea-4538-9b51-82a5b629c048/volumes"
Feb 18 06:06:14 crc kubenswrapper[4869]: I0218 06:06:14.750116 4869 generic.go:334] "Generic (PLEG): container finished" podID="ffdffd9a-f626-4bf2-b1e0-104eca55e7f5" containerID="56db69629c5551f72754a181b5e9e12c994e64a0bd704b08a7755e193c82ade4" exitCode=0
Feb 18 06:06:14 crc kubenswrapper[4869]: I0218 06:06:14.750168 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zzcxf" event={"ID":"ffdffd9a-f626-4bf2-b1e0-104eca55e7f5","Type":"ContainerDied","Data":"56db69629c5551f72754a181b5e9e12c994e64a0bd704b08a7755e193c82ade4"}
Feb 18 06:06:14 crc kubenswrapper[4869]: I0218 06:06:14.766460 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-69d999cf4d-drf2r" podUID="adfe77ee-719d-4b80-ae06-8a0a370cf7d2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused"
Feb 18 06:06:14 crc kubenswrapper[4869]: I0218 06:06:14.896322 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57f5fddd88-qhh5n" podUID="391d8fe4-58ea-434e-918f-811b7c3e14b2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused"
Feb 18 06:06:16 crc kubenswrapper[4869]: I0218 06:06:16.219639 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-zzcxf"
Feb 18 06:06:16 crc kubenswrapper[4869]: I0218 06:06:16.291599 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffdffd9a-f626-4bf2-b1e0-104eca55e7f5-combined-ca-bundle\") pod \"ffdffd9a-f626-4bf2-b1e0-104eca55e7f5\" (UID: \"ffdffd9a-f626-4bf2-b1e0-104eca55e7f5\") "
Feb 18 06:06:16 crc kubenswrapper[4869]: I0218 06:06:16.291652 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ffdffd9a-f626-4bf2-b1e0-104eca55e7f5-db-sync-config-data\") pod \"ffdffd9a-f626-4bf2-b1e0-104eca55e7f5\" (UID: \"ffdffd9a-f626-4bf2-b1e0-104eca55e7f5\") "
Feb 18 06:06:16 crc kubenswrapper[4869]: I0218 06:06:16.291674 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hktd\" (UniqueName: \"kubernetes.io/projected/ffdffd9a-f626-4bf2-b1e0-104eca55e7f5-kube-api-access-8hktd\") pod \"ffdffd9a-f626-4bf2-b1e0-104eca55e7f5\" (UID: \"ffdffd9a-f626-4bf2-b1e0-104eca55e7f5\") "
Feb 18 06:06:16 crc kubenswrapper[4869]: I0218 06:06:16.311367 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffdffd9a-f626-4bf2-b1e0-104eca55e7f5-kube-api-access-8hktd" (OuterVolumeSpecName: "kube-api-access-8hktd") pod "ffdffd9a-f626-4bf2-b1e0-104eca55e7f5" (UID: "ffdffd9a-f626-4bf2-b1e0-104eca55e7f5"). InnerVolumeSpecName "kube-api-access-8hktd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:06:16 crc kubenswrapper[4869]: I0218 06:06:16.311467 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffdffd9a-f626-4bf2-b1e0-104eca55e7f5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ffdffd9a-f626-4bf2-b1e0-104eca55e7f5" (UID: "ffdffd9a-f626-4bf2-b1e0-104eca55e7f5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:06:16 crc kubenswrapper[4869]: I0218 06:06:16.324281 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffdffd9a-f626-4bf2-b1e0-104eca55e7f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ffdffd9a-f626-4bf2-b1e0-104eca55e7f5" (UID: "ffdffd9a-f626-4bf2-b1e0-104eca55e7f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:06:16 crc kubenswrapper[4869]: I0218 06:06:16.394886 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffdffd9a-f626-4bf2-b1e0-104eca55e7f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:16 crc kubenswrapper[4869]: I0218 06:06:16.394921 4869 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ffdffd9a-f626-4bf2-b1e0-104eca55e7f5-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:16 crc kubenswrapper[4869]: I0218 06:06:16.394930 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hktd\" (UniqueName: \"kubernetes.io/projected/ffdffd9a-f626-4bf2-b1e0-104eca55e7f5-kube-api-access-8hktd\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:16 crc kubenswrapper[4869]: I0218 06:06:16.772559 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zzcxf" event={"ID":"ffdffd9a-f626-4bf2-b1e0-104eca55e7f5","Type":"ContainerDied","Data":"c69d7a1eee675ad7001a560b171491ea2be16419cfc958eb747423c556eb345e"}
Feb 18 06:06:16 crc kubenswrapper[4869]: I0218 06:06:16.773130 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c69d7a1eee675ad7001a560b171491ea2be16419cfc958eb747423c556eb345e"
Feb 18 06:06:16 crc kubenswrapper[4869]: I0218 06:06:16.772605 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-zzcxf"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.057594 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-98f68fcf5-pz8rv"]
Feb 18 06:06:17 crc kubenswrapper[4869]: E0218 06:06:17.058010 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89712bb0-67ea-4538-9b51-82a5b629c048" containerName="init"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.058027 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="89712bb0-67ea-4538-9b51-82a5b629c048" containerName="init"
Feb 18 06:06:17 crc kubenswrapper[4869]: E0218 06:06:17.058044 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffdffd9a-f626-4bf2-b1e0-104eca55e7f5" containerName="barbican-db-sync"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.058051 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffdffd9a-f626-4bf2-b1e0-104eca55e7f5" containerName="barbican-db-sync"
Feb 18 06:06:17 crc kubenswrapper[4869]: E0218 06:06:17.058063 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89712bb0-67ea-4538-9b51-82a5b629c048" containerName="dnsmasq-dns"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.058069 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="89712bb0-67ea-4538-9b51-82a5b629c048" containerName="dnsmasq-dns"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.058271 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="89712bb0-67ea-4538-9b51-82a5b629c048" containerName="dnsmasq-dns"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.058318 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffdffd9a-f626-4bf2-b1e0-104eca55e7f5" containerName="barbican-db-sync"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.059259 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-98f68fcf5-pz8rv"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.062498 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.062718 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.062864 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-f5xlh"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.082632 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-98f68fcf5-pz8rv"]
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.096928 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5884488646-54ch2"]
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.099717 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5884488646-54ch2"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.102618 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.119520 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5884488646-54ch2"]
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.208846 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-c476g"]
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.212819 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3b79627-ea10-4a59-a5ae-f24d3ace238d-config-data-custom\") pod \"barbican-worker-98f68fcf5-pz8rv\" (UID: \"c3b79627-ea10-4a59-a5ae-f24d3ace238d\") " pod="openstack/barbican-worker-98f68fcf5-pz8rv"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.212880 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ed72952-b72a-4f66-8e63-84d18936ff3a-config-data\") pod \"barbican-keystone-listener-5884488646-54ch2\" (UID: \"4ed72952-b72a-4f66-8e63-84d18936ff3a\") " pod="openstack/barbican-keystone-listener-5884488646-54ch2"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.212887 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-c476g"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.212933 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqhp6\" (UniqueName: \"kubernetes.io/projected/4ed72952-b72a-4f66-8e63-84d18936ff3a-kube-api-access-fqhp6\") pod \"barbican-keystone-listener-5884488646-54ch2\" (UID: \"4ed72952-b72a-4f66-8e63-84d18936ff3a\") " pod="openstack/barbican-keystone-listener-5884488646-54ch2"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.212983 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ed72952-b72a-4f66-8e63-84d18936ff3a-combined-ca-bundle\") pod \"barbican-keystone-listener-5884488646-54ch2\" (UID: \"4ed72952-b72a-4f66-8e63-84d18936ff3a\") " pod="openstack/barbican-keystone-listener-5884488646-54ch2"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.213002 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq594\" (UniqueName: \"kubernetes.io/projected/c3b79627-ea10-4a59-a5ae-f24d3ace238d-kube-api-access-jq594\") pod \"barbican-worker-98f68fcf5-pz8rv\" (UID: \"c3b79627-ea10-4a59-a5ae-f24d3ace238d\") " pod="openstack/barbican-worker-98f68fcf5-pz8rv"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.213018 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ed72952-b72a-4f66-8e63-84d18936ff3a-config-data-custom\") pod \"barbican-keystone-listener-5884488646-54ch2\" (UID: \"4ed72952-b72a-4f66-8e63-84d18936ff3a\") " pod="openstack/barbican-keystone-listener-5884488646-54ch2"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.213033 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b79627-ea10-4a59-a5ae-f24d3ace238d-logs\") pod \"barbican-worker-98f68fcf5-pz8rv\" (UID: \"c3b79627-ea10-4a59-a5ae-f24d3ace238d\") " pod="openstack/barbican-worker-98f68fcf5-pz8rv"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.213048 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b79627-ea10-4a59-a5ae-f24d3ace238d-combined-ca-bundle\") pod \"barbican-worker-98f68fcf5-pz8rv\" (UID: \"c3b79627-ea10-4a59-a5ae-f24d3ace238d\") " pod="openstack/barbican-worker-98f68fcf5-pz8rv"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.213066 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ed72952-b72a-4f66-8e63-84d18936ff3a-logs\") pod \"barbican-keystone-listener-5884488646-54ch2\" (UID: \"4ed72952-b72a-4f66-8e63-84d18936ff3a\") " pod="openstack/barbican-keystone-listener-5884488646-54ch2"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.213104 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b79627-ea10-4a59-a5ae-f24d3ace238d-config-data\") pod \"barbican-worker-98f68fcf5-pz8rv\" (UID: \"c3b79627-ea10-4a59-a5ae-f24d3ace238d\") " pod="openstack/barbican-worker-98f68fcf5-pz8rv"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.217389 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-c476g"]
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.314762 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ed72952-b72a-4f66-8e63-84d18936ff3a-config-data\") pod \"barbican-keystone-listener-5884488646-54ch2\" (UID: \"4ed72952-b72a-4f66-8e63-84d18936ff3a\") " pod="openstack/barbican-keystone-listener-5884488646-54ch2"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.314849 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-c476g\" (UID: \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\") " pod="openstack/dnsmasq-dns-85ff748b95-c476g"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.314880 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqhp6\" (UniqueName: \"kubernetes.io/projected/4ed72952-b72a-4f66-8e63-84d18936ff3a-kube-api-access-fqhp6\") pod \"barbican-keystone-listener-5884488646-54ch2\" (UID: \"4ed72952-b72a-4f66-8e63-84d18936ff3a\") " pod="openstack/barbican-keystone-listener-5884488646-54ch2"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.314911 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-config\") pod \"dnsmasq-dns-85ff748b95-c476g\" (UID: \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\") " pod="openstack/dnsmasq-dns-85ff748b95-c476g"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.314932 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-c476g\" (UID: \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\") " pod="openstack/dnsmasq-dns-85ff748b95-c476g"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.314964 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ed72952-b72a-4f66-8e63-84d18936ff3a-combined-ca-bundle\") pod \"barbican-keystone-listener-5884488646-54ch2\" (UID: \"4ed72952-b72a-4f66-8e63-84d18936ff3a\") " pod="openstack/barbican-keystone-listener-5884488646-54ch2"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.314982 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq594\" (UniqueName: \"kubernetes.io/projected/c3b79627-ea10-4a59-a5ae-f24d3ace238d-kube-api-access-jq594\") pod \"barbican-worker-98f68fcf5-pz8rv\" (UID: \"c3b79627-ea10-4a59-a5ae-f24d3ace238d\") " pod="openstack/barbican-worker-98f68fcf5-pz8rv"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.314999 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ed72952-b72a-4f66-8e63-84d18936ff3a-config-data-custom\") pod \"barbican-keystone-listener-5884488646-54ch2\" (UID: \"4ed72952-b72a-4f66-8e63-84d18936ff3a\") " pod="openstack/barbican-keystone-listener-5884488646-54ch2"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.315015 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b79627-ea10-4a59-a5ae-f24d3ace238d-logs\") pod \"barbican-worker-98f68fcf5-pz8rv\" (UID: \"c3b79627-ea10-4a59-a5ae-f24d3ace238d\") " pod="openstack/barbican-worker-98f68fcf5-pz8rv"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.315031 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b79627-ea10-4a59-a5ae-f24d3ace238d-combined-ca-bundle\") pod \"barbican-worker-98f68fcf5-pz8rv\" (UID: \"c3b79627-ea10-4a59-a5ae-f24d3ace238d\") " pod="openstack/barbican-worker-98f68fcf5-pz8rv"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.315047 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55kzf\" (UniqueName: \"kubernetes.io/projected/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-kube-api-access-55kzf\") pod \"dnsmasq-dns-85ff748b95-c476g\" (UID: \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\") " pod="openstack/dnsmasq-dns-85ff748b95-c476g"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.315067 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ed72952-b72a-4f66-8e63-84d18936ff3a-logs\") pod \"barbican-keystone-listener-5884488646-54ch2\" (UID: \"4ed72952-b72a-4f66-8e63-84d18936ff3a\") " pod="openstack/barbican-keystone-listener-5884488646-54ch2"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.315092 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-dns-svc\") pod \"dnsmasq-dns-85ff748b95-c476g\" (UID: \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\") " pod="openstack/dnsmasq-dns-85ff748b95-c476g"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.315126 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b79627-ea10-4a59-a5ae-f24d3ace238d-config-data\") pod \"barbican-worker-98f68fcf5-pz8rv\" (UID: \"c3b79627-ea10-4a59-a5ae-f24d3ace238d\") " pod="openstack/barbican-worker-98f68fcf5-pz8rv"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.315163 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3b79627-ea10-4a59-a5ae-f24d3ace238d-config-data-custom\") pod \"barbican-worker-98f68fcf5-pz8rv\" (UID: \"c3b79627-ea10-4a59-a5ae-f24d3ace238d\") " pod="openstack/barbican-worker-98f68fcf5-pz8rv"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.315184 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-c476g\" (UID: \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\") " pod="openstack/dnsmasq-dns-85ff748b95-c476g"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.316716 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ed72952-b72a-4f66-8e63-84d18936ff3a-logs\") pod \"barbican-keystone-listener-5884488646-54ch2\" (UID: \"4ed72952-b72a-4f66-8e63-84d18936ff3a\") " pod="openstack/barbican-keystone-listener-5884488646-54ch2"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.317115 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b79627-ea10-4a59-a5ae-f24d3ace238d-logs\") pod \"barbican-worker-98f68fcf5-pz8rv\" (UID: \"c3b79627-ea10-4a59-a5ae-f24d3ace238d\") " pod="openstack/barbican-worker-98f68fcf5-pz8rv"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.319681 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ed72952-b72a-4f66-8e63-84d18936ff3a-config-data-custom\") pod \"barbican-keystone-listener-5884488646-54ch2\" (UID: \"4ed72952-b72a-4f66-8e63-84d18936ff3a\") " pod="openstack/barbican-keystone-listener-5884488646-54ch2"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.319762 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b79627-ea10-4a59-a5ae-f24d3ace238d-combined-ca-bundle\") pod \"barbican-worker-98f68fcf5-pz8rv\" (UID: \"c3b79627-ea10-4a59-a5ae-f24d3ace238d\") " pod="openstack/barbican-worker-98f68fcf5-pz8rv"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.322735 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ed72952-b72a-4f66-8e63-84d18936ff3a-combined-ca-bundle\") pod \"barbican-keystone-listener-5884488646-54ch2\" (UID: \"4ed72952-b72a-4f66-8e63-84d18936ff3a\") " pod="openstack/barbican-keystone-listener-5884488646-54ch2"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.326883 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b79627-ea10-4a59-a5ae-f24d3ace238d-config-data\") pod \"barbican-worker-98f68fcf5-pz8rv\" (UID: \"c3b79627-ea10-4a59-a5ae-f24d3ace238d\") " pod="openstack/barbican-worker-98f68fcf5-pz8rv"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.348290 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqhp6\" (UniqueName: \"kubernetes.io/projected/4ed72952-b72a-4f66-8e63-84d18936ff3a-kube-api-access-fqhp6\") pod \"barbican-keystone-listener-5884488646-54ch2\" (UID: \"4ed72952-b72a-4f66-8e63-84d18936ff3a\") " pod="openstack/barbican-keystone-listener-5884488646-54ch2"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.349305 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ed72952-b72a-4f66-8e63-84d18936ff3a-config-data\") pod \"barbican-keystone-listener-5884488646-54ch2\" (UID: \"4ed72952-b72a-4f66-8e63-84d18936ff3a\") " pod="openstack/barbican-keystone-listener-5884488646-54ch2"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.353271 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq594\" (UniqueName: \"kubernetes.io/projected/c3b79627-ea10-4a59-a5ae-f24d3ace238d-kube-api-access-jq594\") pod \"barbican-worker-98f68fcf5-pz8rv\" (UID: \"c3b79627-ea10-4a59-a5ae-f24d3ace238d\") " pod="openstack/barbican-worker-98f68fcf5-pz8rv"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.355063 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3b79627-ea10-4a59-a5ae-f24d3ace238d-config-data-custom\") pod \"barbican-worker-98f68fcf5-pz8rv\" (UID: \"c3b79627-ea10-4a59-a5ae-f24d3ace238d\") " pod="openstack/barbican-worker-98f68fcf5-pz8rv"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.385775 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-56cbc85666-kbbjf"]
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.387332 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56cbc85666-kbbjf"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.388559 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-98f68fcf5-pz8rv"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.391776 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.402065 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56cbc85666-kbbjf"]
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.417290 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55kzf\" (UniqueName: \"kubernetes.io/projected/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-kube-api-access-55kzf\") pod \"dnsmasq-dns-85ff748b95-c476g\" (UID: \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\") " pod="openstack/dnsmasq-dns-85ff748b95-c476g"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.417380 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-dns-svc\") pod \"dnsmasq-dns-85ff748b95-c476g\" (UID: \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\") " pod="openstack/dnsmasq-dns-85ff748b95-c476g"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.417452 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-c476g\" (UID: \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\") " pod="openstack/dnsmasq-dns-85ff748b95-c476g"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.417505 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-c476g\" (UID: \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\") " pod="openstack/dnsmasq-dns-85ff748b95-c476g"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.417539 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-config\") pod \"dnsmasq-dns-85ff748b95-c476g\" (UID: \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\") " pod="openstack/dnsmasq-dns-85ff748b95-c476g"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.417559 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-c476g\" (UID: \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\") " pod="openstack/dnsmasq-dns-85ff748b95-c476g"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.418399 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-c476g\" (UID: \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\") " pod="openstack/dnsmasq-dns-85ff748b95-c476g"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.422012 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-dns-svc\") pod \"dnsmasq-dns-85ff748b95-c476g\" (UID: \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\") " pod="openstack/dnsmasq-dns-85ff748b95-c476g"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.422046 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-config\") pod \"dnsmasq-dns-85ff748b95-c476g\" (UID: \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\") " pod="openstack/dnsmasq-dns-85ff748b95-c476g"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.422346 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-c476g\" (UID: \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\") " pod="openstack/dnsmasq-dns-85ff748b95-c476g"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.423293 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-c476g\" (UID: \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\") " pod="openstack/dnsmasq-dns-85ff748b95-c476g"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.430463 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5884488646-54ch2"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.449380 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55kzf\" (UniqueName: \"kubernetes.io/projected/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-kube-api-access-55kzf\") pod \"dnsmasq-dns-85ff748b95-c476g\" (UID: \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\") " pod="openstack/dnsmasq-dns-85ff748b95-c476g"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.519525 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d50f595-31ed-4c89-ad13-63ae638b83c0-combined-ca-bundle\") pod \"barbican-api-56cbc85666-kbbjf\" (UID: \"2d50f595-31ed-4c89-ad13-63ae638b83c0\") " pod="openstack/barbican-api-56cbc85666-kbbjf"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.519698 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d50f595-31ed-4c89-ad13-63ae638b83c0-config-data-custom\") pod \"barbican-api-56cbc85666-kbbjf\" (UID: \"2d50f595-31ed-4c89-ad13-63ae638b83c0\") " pod="openstack/barbican-api-56cbc85666-kbbjf"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.519903 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d50f595-31ed-4c89-ad13-63ae638b83c0-config-data\") pod \"barbican-api-56cbc85666-kbbjf\" (UID: \"2d50f595-31ed-4c89-ad13-63ae638b83c0\") " pod="openstack/barbican-api-56cbc85666-kbbjf"
Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.519995 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d50f595-31ed-4c89-ad13-63ae638b83c0-logs\") pod
\"barbican-api-56cbc85666-kbbjf\" (UID: \"2d50f595-31ed-4c89-ad13-63ae638b83c0\") " pod="openstack/barbican-api-56cbc85666-kbbjf" Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.520114 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-888ql\" (UniqueName: \"kubernetes.io/projected/2d50f595-31ed-4c89-ad13-63ae638b83c0-kube-api-access-888ql\") pod \"barbican-api-56cbc85666-kbbjf\" (UID: \"2d50f595-31ed-4c89-ad13-63ae638b83c0\") " pod="openstack/barbican-api-56cbc85666-kbbjf" Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.548201 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-c476g" Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.622543 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-888ql\" (UniqueName: \"kubernetes.io/projected/2d50f595-31ed-4c89-ad13-63ae638b83c0-kube-api-access-888ql\") pod \"barbican-api-56cbc85666-kbbjf\" (UID: \"2d50f595-31ed-4c89-ad13-63ae638b83c0\") " pod="openstack/barbican-api-56cbc85666-kbbjf" Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.622713 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d50f595-31ed-4c89-ad13-63ae638b83c0-combined-ca-bundle\") pod \"barbican-api-56cbc85666-kbbjf\" (UID: \"2d50f595-31ed-4c89-ad13-63ae638b83c0\") " pod="openstack/barbican-api-56cbc85666-kbbjf" Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.622762 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d50f595-31ed-4c89-ad13-63ae638b83c0-config-data-custom\") pod \"barbican-api-56cbc85666-kbbjf\" (UID: \"2d50f595-31ed-4c89-ad13-63ae638b83c0\") " pod="openstack/barbican-api-56cbc85666-kbbjf" Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.622803 
4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d50f595-31ed-4c89-ad13-63ae638b83c0-config-data\") pod \"barbican-api-56cbc85666-kbbjf\" (UID: \"2d50f595-31ed-4c89-ad13-63ae638b83c0\") " pod="openstack/barbican-api-56cbc85666-kbbjf" Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.622829 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d50f595-31ed-4c89-ad13-63ae638b83c0-logs\") pod \"barbican-api-56cbc85666-kbbjf\" (UID: \"2d50f595-31ed-4c89-ad13-63ae638b83c0\") " pod="openstack/barbican-api-56cbc85666-kbbjf" Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.623241 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d50f595-31ed-4c89-ad13-63ae638b83c0-logs\") pod \"barbican-api-56cbc85666-kbbjf\" (UID: \"2d50f595-31ed-4c89-ad13-63ae638b83c0\") " pod="openstack/barbican-api-56cbc85666-kbbjf" Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.626327 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d50f595-31ed-4c89-ad13-63ae638b83c0-config-data-custom\") pod \"barbican-api-56cbc85666-kbbjf\" (UID: \"2d50f595-31ed-4c89-ad13-63ae638b83c0\") " pod="openstack/barbican-api-56cbc85666-kbbjf" Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.628332 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d50f595-31ed-4c89-ad13-63ae638b83c0-combined-ca-bundle\") pod \"barbican-api-56cbc85666-kbbjf\" (UID: \"2d50f595-31ed-4c89-ad13-63ae638b83c0\") " pod="openstack/barbican-api-56cbc85666-kbbjf" Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.641798 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2d50f595-31ed-4c89-ad13-63ae638b83c0-config-data\") pod \"barbican-api-56cbc85666-kbbjf\" (UID: \"2d50f595-31ed-4c89-ad13-63ae638b83c0\") " pod="openstack/barbican-api-56cbc85666-kbbjf" Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.647636 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-888ql\" (UniqueName: \"kubernetes.io/projected/2d50f595-31ed-4c89-ad13-63ae638b83c0-kube-api-access-888ql\") pod \"barbican-api-56cbc85666-kbbjf\" (UID: \"2d50f595-31ed-4c89-ad13-63ae638b83c0\") " pod="openstack/barbican-api-56cbc85666-kbbjf" Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.740514 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56cbc85666-kbbjf" Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.791652 4869 generic.go:334] "Generic (PLEG): container finished" podID="77d2d3cf-1108-468b-816a-64d29471542e" containerID="5710b416d77ba4c1b05282bf3850529b3d884abdc7dc8b9bfb141a5daf9abf53" exitCode=0 Feb 18 06:06:17 crc kubenswrapper[4869]: I0218 06:06:17.791702 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-8v5fn" event={"ID":"77d2d3cf-1108-468b-816a-64d29471542e","Type":"ContainerDied","Data":"5710b416d77ba4c1b05282bf3850529b3d884abdc7dc8b9bfb141a5daf9abf53"} Feb 18 06:06:19 crc kubenswrapper[4869]: I0218 06:06:19.890115 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-85cc9b8698-mzcgf"] Feb 18 06:06:19 crc kubenswrapper[4869]: I0218 06:06:19.893315 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-85cc9b8698-mzcgf" Feb 18 06:06:19 crc kubenswrapper[4869]: I0218 06:06:19.896149 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 18 06:06:19 crc kubenswrapper[4869]: I0218 06:06:19.896417 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 18 06:06:19 crc kubenswrapper[4869]: I0218 06:06:19.906087 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-85cc9b8698-mzcgf"] Feb 18 06:06:19 crc kubenswrapper[4869]: I0218 06:06:19.966985 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00e28946-38e8-4b00-8181-d45908ad9863-public-tls-certs\") pod \"barbican-api-85cc9b8698-mzcgf\" (UID: \"00e28946-38e8-4b00-8181-d45908ad9863\") " pod="openstack/barbican-api-85cc9b8698-mzcgf" Feb 18 06:06:19 crc kubenswrapper[4869]: I0218 06:06:19.967315 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00e28946-38e8-4b00-8181-d45908ad9863-logs\") pod \"barbican-api-85cc9b8698-mzcgf\" (UID: \"00e28946-38e8-4b00-8181-d45908ad9863\") " pod="openstack/barbican-api-85cc9b8698-mzcgf" Feb 18 06:06:19 crc kubenswrapper[4869]: I0218 06:06:19.967344 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00e28946-38e8-4b00-8181-d45908ad9863-config-data\") pod \"barbican-api-85cc9b8698-mzcgf\" (UID: \"00e28946-38e8-4b00-8181-d45908ad9863\") " pod="openstack/barbican-api-85cc9b8698-mzcgf" Feb 18 06:06:19 crc kubenswrapper[4869]: I0218 06:06:19.967389 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/00e28946-38e8-4b00-8181-d45908ad9863-config-data-custom\") pod \"barbican-api-85cc9b8698-mzcgf\" (UID: \"00e28946-38e8-4b00-8181-d45908ad9863\") " pod="openstack/barbican-api-85cc9b8698-mzcgf" Feb 18 06:06:19 crc kubenswrapper[4869]: I0218 06:06:19.967461 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00e28946-38e8-4b00-8181-d45908ad9863-internal-tls-certs\") pod \"barbican-api-85cc9b8698-mzcgf\" (UID: \"00e28946-38e8-4b00-8181-d45908ad9863\") " pod="openstack/barbican-api-85cc9b8698-mzcgf" Feb 18 06:06:19 crc kubenswrapper[4869]: I0218 06:06:19.967482 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00e28946-38e8-4b00-8181-d45908ad9863-combined-ca-bundle\") pod \"barbican-api-85cc9b8698-mzcgf\" (UID: \"00e28946-38e8-4b00-8181-d45908ad9863\") " pod="openstack/barbican-api-85cc9b8698-mzcgf" Feb 18 06:06:19 crc kubenswrapper[4869]: I0218 06:06:19.967504 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w285\" (UniqueName: \"kubernetes.io/projected/00e28946-38e8-4b00-8181-d45908ad9863-kube-api-access-5w285\") pod \"barbican-api-85cc9b8698-mzcgf\" (UID: \"00e28946-38e8-4b00-8181-d45908ad9863\") " pod="openstack/barbican-api-85cc9b8698-mzcgf" Feb 18 06:06:20 crc kubenswrapper[4869]: I0218 06:06:20.073136 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00e28946-38e8-4b00-8181-d45908ad9863-public-tls-certs\") pod \"barbican-api-85cc9b8698-mzcgf\" (UID: \"00e28946-38e8-4b00-8181-d45908ad9863\") " pod="openstack/barbican-api-85cc9b8698-mzcgf" Feb 18 06:06:20 crc kubenswrapper[4869]: I0218 06:06:20.073208 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00e28946-38e8-4b00-8181-d45908ad9863-logs\") pod \"barbican-api-85cc9b8698-mzcgf\" (UID: \"00e28946-38e8-4b00-8181-d45908ad9863\") " pod="openstack/barbican-api-85cc9b8698-mzcgf" Feb 18 06:06:20 crc kubenswrapper[4869]: I0218 06:06:20.073237 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00e28946-38e8-4b00-8181-d45908ad9863-config-data\") pod \"barbican-api-85cc9b8698-mzcgf\" (UID: \"00e28946-38e8-4b00-8181-d45908ad9863\") " pod="openstack/barbican-api-85cc9b8698-mzcgf" Feb 18 06:06:20 crc kubenswrapper[4869]: I0218 06:06:20.073300 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00e28946-38e8-4b00-8181-d45908ad9863-config-data-custom\") pod \"barbican-api-85cc9b8698-mzcgf\" (UID: \"00e28946-38e8-4b00-8181-d45908ad9863\") " pod="openstack/barbican-api-85cc9b8698-mzcgf" Feb 18 06:06:20 crc kubenswrapper[4869]: I0218 06:06:20.073388 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00e28946-38e8-4b00-8181-d45908ad9863-internal-tls-certs\") pod \"barbican-api-85cc9b8698-mzcgf\" (UID: \"00e28946-38e8-4b00-8181-d45908ad9863\") " pod="openstack/barbican-api-85cc9b8698-mzcgf" Feb 18 06:06:20 crc kubenswrapper[4869]: I0218 06:06:20.073406 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00e28946-38e8-4b00-8181-d45908ad9863-combined-ca-bundle\") pod \"barbican-api-85cc9b8698-mzcgf\" (UID: \"00e28946-38e8-4b00-8181-d45908ad9863\") " pod="openstack/barbican-api-85cc9b8698-mzcgf" Feb 18 06:06:20 crc kubenswrapper[4869]: I0218 06:06:20.073447 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w285\" (UniqueName: 
\"kubernetes.io/projected/00e28946-38e8-4b00-8181-d45908ad9863-kube-api-access-5w285\") pod \"barbican-api-85cc9b8698-mzcgf\" (UID: \"00e28946-38e8-4b00-8181-d45908ad9863\") " pod="openstack/barbican-api-85cc9b8698-mzcgf" Feb 18 06:06:20 crc kubenswrapper[4869]: I0218 06:06:20.077004 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00e28946-38e8-4b00-8181-d45908ad9863-logs\") pod \"barbican-api-85cc9b8698-mzcgf\" (UID: \"00e28946-38e8-4b00-8181-d45908ad9863\") " pod="openstack/barbican-api-85cc9b8698-mzcgf" Feb 18 06:06:20 crc kubenswrapper[4869]: I0218 06:06:20.083395 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00e28946-38e8-4b00-8181-d45908ad9863-public-tls-certs\") pod \"barbican-api-85cc9b8698-mzcgf\" (UID: \"00e28946-38e8-4b00-8181-d45908ad9863\") " pod="openstack/barbican-api-85cc9b8698-mzcgf" Feb 18 06:06:20 crc kubenswrapper[4869]: I0218 06:06:20.083446 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00e28946-38e8-4b00-8181-d45908ad9863-internal-tls-certs\") pod \"barbican-api-85cc9b8698-mzcgf\" (UID: \"00e28946-38e8-4b00-8181-d45908ad9863\") " pod="openstack/barbican-api-85cc9b8698-mzcgf" Feb 18 06:06:20 crc kubenswrapper[4869]: I0218 06:06:20.083688 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00e28946-38e8-4b00-8181-d45908ad9863-config-data-custom\") pod \"barbican-api-85cc9b8698-mzcgf\" (UID: \"00e28946-38e8-4b00-8181-d45908ad9863\") " pod="openstack/barbican-api-85cc9b8698-mzcgf" Feb 18 06:06:20 crc kubenswrapper[4869]: I0218 06:06:20.083986 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00e28946-38e8-4b00-8181-d45908ad9863-combined-ca-bundle\") 
pod \"barbican-api-85cc9b8698-mzcgf\" (UID: \"00e28946-38e8-4b00-8181-d45908ad9863\") " pod="openstack/barbican-api-85cc9b8698-mzcgf" Feb 18 06:06:20 crc kubenswrapper[4869]: I0218 06:06:20.084984 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00e28946-38e8-4b00-8181-d45908ad9863-config-data\") pod \"barbican-api-85cc9b8698-mzcgf\" (UID: \"00e28946-38e8-4b00-8181-d45908ad9863\") " pod="openstack/barbican-api-85cc9b8698-mzcgf" Feb 18 06:06:20 crc kubenswrapper[4869]: I0218 06:06:20.100341 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w285\" (UniqueName: \"kubernetes.io/projected/00e28946-38e8-4b00-8181-d45908ad9863-kube-api-access-5w285\") pod \"barbican-api-85cc9b8698-mzcgf\" (UID: \"00e28946-38e8-4b00-8181-d45908ad9863\") " pod="openstack/barbican-api-85cc9b8698-mzcgf" Feb 18 06:06:20 crc kubenswrapper[4869]: I0218 06:06:20.220067 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-85cc9b8698-mzcgf" Feb 18 06:06:21 crc kubenswrapper[4869]: I0218 06:06:21.401090 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-8v5fn" Feb 18 06:06:21 crc kubenswrapper[4869]: I0218 06:06:21.502027 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d2d3cf-1108-468b-816a-64d29471542e-combined-ca-bundle\") pod \"77d2d3cf-1108-468b-816a-64d29471542e\" (UID: \"77d2d3cf-1108-468b-816a-64d29471542e\") " Feb 18 06:06:21 crc kubenswrapper[4869]: I0218 06:06:21.502084 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/77d2d3cf-1108-468b-816a-64d29471542e-etc-machine-id\") pod \"77d2d3cf-1108-468b-816a-64d29471542e\" (UID: \"77d2d3cf-1108-468b-816a-64d29471542e\") " Feb 18 06:06:21 crc kubenswrapper[4869]: I0218 06:06:21.502170 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77d2d3cf-1108-468b-816a-64d29471542e-scripts\") pod \"77d2d3cf-1108-468b-816a-64d29471542e\" (UID: \"77d2d3cf-1108-468b-816a-64d29471542e\") " Feb 18 06:06:21 crc kubenswrapper[4869]: I0218 06:06:21.502236 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77d2d3cf-1108-468b-816a-64d29471542e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "77d2d3cf-1108-468b-816a-64d29471542e" (UID: "77d2d3cf-1108-468b-816a-64d29471542e"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:06:21 crc kubenswrapper[4869]: I0218 06:06:21.502261 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxn66\" (UniqueName: \"kubernetes.io/projected/77d2d3cf-1108-468b-816a-64d29471542e-kube-api-access-hxn66\") pod \"77d2d3cf-1108-468b-816a-64d29471542e\" (UID: \"77d2d3cf-1108-468b-816a-64d29471542e\") " Feb 18 06:06:21 crc kubenswrapper[4869]: I0218 06:06:21.502431 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/77d2d3cf-1108-468b-816a-64d29471542e-db-sync-config-data\") pod \"77d2d3cf-1108-468b-816a-64d29471542e\" (UID: \"77d2d3cf-1108-468b-816a-64d29471542e\") " Feb 18 06:06:21 crc kubenswrapper[4869]: I0218 06:06:21.502491 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77d2d3cf-1108-468b-816a-64d29471542e-config-data\") pod \"77d2d3cf-1108-468b-816a-64d29471542e\" (UID: \"77d2d3cf-1108-468b-816a-64d29471542e\") " Feb 18 06:06:21 crc kubenswrapper[4869]: I0218 06:06:21.503312 4869 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/77d2d3cf-1108-468b-816a-64d29471542e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:21 crc kubenswrapper[4869]: I0218 06:06:21.507889 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77d2d3cf-1108-468b-816a-64d29471542e-scripts" (OuterVolumeSpecName: "scripts") pod "77d2d3cf-1108-468b-816a-64d29471542e" (UID: "77d2d3cf-1108-468b-816a-64d29471542e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:21 crc kubenswrapper[4869]: I0218 06:06:21.507920 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77d2d3cf-1108-468b-816a-64d29471542e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "77d2d3cf-1108-468b-816a-64d29471542e" (UID: "77d2d3cf-1108-468b-816a-64d29471542e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:21 crc kubenswrapper[4869]: I0218 06:06:21.509075 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77d2d3cf-1108-468b-816a-64d29471542e-kube-api-access-hxn66" (OuterVolumeSpecName: "kube-api-access-hxn66") pod "77d2d3cf-1108-468b-816a-64d29471542e" (UID: "77d2d3cf-1108-468b-816a-64d29471542e"). InnerVolumeSpecName "kube-api-access-hxn66". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:21 crc kubenswrapper[4869]: I0218 06:06:21.538618 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77d2d3cf-1108-468b-816a-64d29471542e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77d2d3cf-1108-468b-816a-64d29471542e" (UID: "77d2d3cf-1108-468b-816a-64d29471542e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:21 crc kubenswrapper[4869]: I0218 06:06:21.555930 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77d2d3cf-1108-468b-816a-64d29471542e-config-data" (OuterVolumeSpecName: "config-data") pod "77d2d3cf-1108-468b-816a-64d29471542e" (UID: "77d2d3cf-1108-468b-816a-64d29471542e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:21 crc kubenswrapper[4869]: I0218 06:06:21.605179 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77d2d3cf-1108-468b-816a-64d29471542e-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:21 crc kubenswrapper[4869]: I0218 06:06:21.605244 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxn66\" (UniqueName: \"kubernetes.io/projected/77d2d3cf-1108-468b-816a-64d29471542e-kube-api-access-hxn66\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:21 crc kubenswrapper[4869]: I0218 06:06:21.605257 4869 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/77d2d3cf-1108-468b-816a-64d29471542e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:21 crc kubenswrapper[4869]: I0218 06:06:21.605267 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77d2d3cf-1108-468b-816a-64d29471542e-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:21 crc kubenswrapper[4869]: I0218 06:06:21.605277 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d2d3cf-1108-468b-816a-64d29471542e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:21 crc kubenswrapper[4869]: I0218 06:06:21.852804 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-8v5fn" event={"ID":"77d2d3cf-1108-468b-816a-64d29471542e","Type":"ContainerDied","Data":"10037a653d3f631c8e2b8bd77b0e33eb27a777b6d3b6d574802873cd86653038"} Feb 18 06:06:21 crc kubenswrapper[4869]: I0218 06:06:21.852851 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10037a653d3f631c8e2b8bd77b0e33eb27a777b6d3b6d574802873cd86653038" Feb 18 06:06:21 crc kubenswrapper[4869]: I0218 06:06:21.852932 4869 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-8v5fn" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.672433 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 06:06:22 crc kubenswrapper[4869]: E0218 06:06:22.673433 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77d2d3cf-1108-468b-816a-64d29471542e" containerName="cinder-db-sync" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.673449 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="77d2d3cf-1108-468b-816a-64d29471542e" containerName="cinder-db-sync" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.673703 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="77d2d3cf-1108-468b-816a-64d29471542e" containerName="cinder-db-sync" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.674858 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.685611 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-zklr2" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.685856 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.685972 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.686090 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.702871 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.754319 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/346e57be-67e8-41ae-8c35-8ca7be52d847-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"346e57be-67e8-41ae-8c35-8ca7be52d847\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.754369 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/346e57be-67e8-41ae-8c35-8ca7be52d847-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"346e57be-67e8-41ae-8c35-8ca7be52d847\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.754675 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2pfq\" (UniqueName: \"kubernetes.io/projected/346e57be-67e8-41ae-8c35-8ca7be52d847-kube-api-access-g2pfq\") pod \"cinder-scheduler-0\" (UID: \"346e57be-67e8-41ae-8c35-8ca7be52d847\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.755138 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/346e57be-67e8-41ae-8c35-8ca7be52d847-config-data\") pod \"cinder-scheduler-0\" (UID: \"346e57be-67e8-41ae-8c35-8ca7be52d847\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.755219 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/346e57be-67e8-41ae-8c35-8ca7be52d847-scripts\") pod \"cinder-scheduler-0\" (UID: \"346e57be-67e8-41ae-8c35-8ca7be52d847\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.755327 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/346e57be-67e8-41ae-8c35-8ca7be52d847-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"346e57be-67e8-41ae-8c35-8ca7be52d847\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.807046 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-c476g"] Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.857544 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2pfq\" (UniqueName: \"kubernetes.io/projected/346e57be-67e8-41ae-8c35-8ca7be52d847-kube-api-access-g2pfq\") pod \"cinder-scheduler-0\" (UID: \"346e57be-67e8-41ae-8c35-8ca7be52d847\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.857645 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/346e57be-67e8-41ae-8c35-8ca7be52d847-config-data\") pod \"cinder-scheduler-0\" (UID: \"346e57be-67e8-41ae-8c35-8ca7be52d847\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.857668 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/346e57be-67e8-41ae-8c35-8ca7be52d847-scripts\") pod \"cinder-scheduler-0\" (UID: \"346e57be-67e8-41ae-8c35-8ca7be52d847\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.857699 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/346e57be-67e8-41ae-8c35-8ca7be52d847-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"346e57be-67e8-41ae-8c35-8ca7be52d847\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.857769 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/346e57be-67e8-41ae-8c35-8ca7be52d847-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"346e57be-67e8-41ae-8c35-8ca7be52d847\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.857785 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/346e57be-67e8-41ae-8c35-8ca7be52d847-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"346e57be-67e8-41ae-8c35-8ca7be52d847\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.857849 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/346e57be-67e8-41ae-8c35-8ca7be52d847-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"346e57be-67e8-41ae-8c35-8ca7be52d847\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.882771 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-qd9jb"] Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.884570 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.889660 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/346e57be-67e8-41ae-8c35-8ca7be52d847-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"346e57be-67e8-41ae-8c35-8ca7be52d847\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.895713 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/346e57be-67e8-41ae-8c35-8ca7be52d847-config-data\") pod \"cinder-scheduler-0\" (UID: \"346e57be-67e8-41ae-8c35-8ca7be52d847\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.895711 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/346e57be-67e8-41ae-8c35-8ca7be52d847-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"346e57be-67e8-41ae-8c35-8ca7be52d847\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.897846 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/346e57be-67e8-41ae-8c35-8ca7be52d847-scripts\") pod \"cinder-scheduler-0\" (UID: \"346e57be-67e8-41ae-8c35-8ca7be52d847\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.902212 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-qd9jb"] Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.906584 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2pfq\" (UniqueName: \"kubernetes.io/projected/346e57be-67e8-41ae-8c35-8ca7be52d847-kube-api-access-g2pfq\") pod \"cinder-scheduler-0\" (UID: \"346e57be-67e8-41ae-8c35-8ca7be52d847\") 
" pod="openstack/cinder-scheduler-0" Feb 18 06:06:22 crc kubenswrapper[4869]: I0218 06:06:22.960599 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.070328 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-qd9jb\" (UID: \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.073894 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48k6b\" (UniqueName: \"kubernetes.io/projected/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-kube-api-access-48k6b\") pod \"dnsmasq-dns-5c9776ccc5-qd9jb\" (UID: \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.074234 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-qd9jb\" (UID: \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.074279 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-qd9jb\" (UID: \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.074304 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-config\") pod \"dnsmasq-dns-5c9776ccc5-qd9jb\" (UID: \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.074374 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-qd9jb\" (UID: \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.084333 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.088659 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.094344 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.097826 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.176258 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-qd9jb\" (UID: \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.176327 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1565c2f4-e8d2-48ac-944e-f143856732db-config-data-custom\") pod 
\"cinder-api-0\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") " pod="openstack/cinder-api-0" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.176354 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1565c2f4-e8d2-48ac-944e-f143856732db-logs\") pod \"cinder-api-0\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") " pod="openstack/cinder-api-0" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.176383 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-qd9jb\" (UID: \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.176410 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-config\") pod \"dnsmasq-dns-5c9776ccc5-qd9jb\" (UID: \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.176454 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-qd9jb\" (UID: \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.176501 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-qd9jb\" (UID: \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" Feb 18 
06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.176539 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1565c2f4-e8d2-48ac-944e-f143856732db-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") " pod="openstack/cinder-api-0" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.176566 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48k6b\" (UniqueName: \"kubernetes.io/projected/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-kube-api-access-48k6b\") pod \"dnsmasq-dns-5c9776ccc5-qd9jb\" (UID: \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.176597 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1565c2f4-e8d2-48ac-944e-f143856732db-scripts\") pod \"cinder-api-0\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") " pod="openstack/cinder-api-0" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.176617 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp42n\" (UniqueName: \"kubernetes.io/projected/1565c2f4-e8d2-48ac-944e-f143856732db-kube-api-access-fp42n\") pod \"cinder-api-0\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") " pod="openstack/cinder-api-0" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.176657 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1565c2f4-e8d2-48ac-944e-f143856732db-config-data\") pod \"cinder-api-0\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") " pod="openstack/cinder-api-0" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.176695 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1565c2f4-e8d2-48ac-944e-f143856732db-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") " pod="openstack/cinder-api-0" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.177990 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-qd9jb\" (UID: \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.178255 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-qd9jb\" (UID: \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.178579 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-qd9jb\" (UID: \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.179197 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-config\") pod \"dnsmasq-dns-5c9776ccc5-qd9jb\" (UID: \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.179287 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-qd9jb\" (UID: \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.200731 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48k6b\" (UniqueName: \"kubernetes.io/projected/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-kube-api-access-48k6b\") pod \"dnsmasq-dns-5c9776ccc5-qd9jb\" (UID: \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.277965 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1565c2f4-e8d2-48ac-944e-f143856732db-config-data\") pod \"cinder-api-0\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") " pod="openstack/cinder-api-0" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.278269 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1565c2f4-e8d2-48ac-944e-f143856732db-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") " pod="openstack/cinder-api-0" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.278324 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1565c2f4-e8d2-48ac-944e-f143856732db-config-data-custom\") pod \"cinder-api-0\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") " pod="openstack/cinder-api-0" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.278341 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1565c2f4-e8d2-48ac-944e-f143856732db-logs\") pod \"cinder-api-0\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") " 
pod="openstack/cinder-api-0" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.278411 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1565c2f4-e8d2-48ac-944e-f143856732db-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") " pod="openstack/cinder-api-0" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.278443 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1565c2f4-e8d2-48ac-944e-f143856732db-scripts\") pod \"cinder-api-0\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") " pod="openstack/cinder-api-0" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.278460 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp42n\" (UniqueName: \"kubernetes.io/projected/1565c2f4-e8d2-48ac-944e-f143856732db-kube-api-access-fp42n\") pod \"cinder-api-0\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") " pod="openstack/cinder-api-0" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.279287 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1565c2f4-e8d2-48ac-944e-f143856732db-logs\") pod \"cinder-api-0\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") " pod="openstack/cinder-api-0" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.280228 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1565c2f4-e8d2-48ac-944e-f143856732db-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") " pod="openstack/cinder-api-0" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.283944 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1565c2f4-e8d2-48ac-944e-f143856732db-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") " pod="openstack/cinder-api-0" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.285182 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1565c2f4-e8d2-48ac-944e-f143856732db-config-data-custom\") pod \"cinder-api-0\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") " pod="openstack/cinder-api-0" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.285954 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1565c2f4-e8d2-48ac-944e-f143856732db-scripts\") pod \"cinder-api-0\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") " pod="openstack/cinder-api-0" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.287103 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1565c2f4-e8d2-48ac-944e-f143856732db-config-data\") pod \"cinder-api-0\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") " pod="openstack/cinder-api-0" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.287957 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.299371 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp42n\" (UniqueName: \"kubernetes.io/projected/1565c2f4-e8d2-48ac-944e-f143856732db-kube-api-access-fp42n\") pod \"cinder-api-0\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") " pod="openstack/cinder-api-0" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.419996 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56cbc85666-kbbjf"] Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.441520 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-c476g"] Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.455050 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.571341 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5884488646-54ch2"] Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.578320 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-98f68fcf5-pz8rv"] Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.611061 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-85cc9b8698-mzcgf"] Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.757596 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.860878 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-qd9jb"] Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.921569 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"346e57be-67e8-41ae-8c35-8ca7be52d847","Type":"ContainerStarted","Data":"014c1d0430ac54f5a64ff8e03626b8287d3024a004476f824a46212f8f01de46"} Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.923008 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-98f68fcf5-pz8rv" event={"ID":"c3b79627-ea10-4a59-a5ae-f24d3ace238d","Type":"ContainerStarted","Data":"4704349ff25a8a135536f0bd966e824f2d2b59f935bedf0cf78b7e477372b548"} Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.927329 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85cc9b8698-mzcgf" event={"ID":"00e28946-38e8-4b00-8181-d45908ad9863","Type":"ContainerStarted","Data":"3ea6fe6022e2afcebc3f94a230d9b5905502c4d475eb45c9faf2d9aecdb1c16d"} Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.928589 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-c476g" event={"ID":"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335","Type":"ContainerStarted","Data":"5661f437321aeda22b848c700a1f4a1c26e0e83a24c3209ee3a4f70b83d9b9dc"} Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.930212 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" event={"ID":"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4","Type":"ContainerStarted","Data":"6d00df454313bd1d26df965c9a2d38a92e94d749cf51013ec9eb3877ab5a8501"} Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.933352 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91512d0d-84f7-41c0-aca8-cbf9d2839927","Type":"ContainerStarted","Data":"dbc8dd352de1a387568d2a7a19a86c3d75ca0aa8d8e171dd7dd198f80585dca9"} Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.933500 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="91512d0d-84f7-41c0-aca8-cbf9d2839927" containerName="ceilometer-central-agent" 
containerID="cri-o://f09f7ddec431acb4437d2f6a245a598f31fbdd07457ae52f0d2bddb63ae30624" gracePeriod=30 Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.933737 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.934019 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="91512d0d-84f7-41c0-aca8-cbf9d2839927" containerName="proxy-httpd" containerID="cri-o://dbc8dd352de1a387568d2a7a19a86c3d75ca0aa8d8e171dd7dd198f80585dca9" gracePeriod=30 Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.934079 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="91512d0d-84f7-41c0-aca8-cbf9d2839927" containerName="sg-core" containerID="cri-o://cb76054c6f93b947662ea5d20ca8b46ce0707c8f6eb72b79dfdf979cd0c1f034" gracePeriod=30 Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.934117 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="91512d0d-84f7-41c0-aca8-cbf9d2839927" containerName="ceilometer-notification-agent" containerID="cri-o://ddd99ad38a349df8ed1aa180885bb67905bce7cc10ac1040ed3d5c8c37bc13b5" gracePeriod=30 Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.942974 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5884488646-54ch2" event={"ID":"4ed72952-b72a-4f66-8e63-84d18936ff3a","Type":"ContainerStarted","Data":"39cbe3a18dc0da377d918476812cb2ef9b18cfd49eca5765d5329362635c1fb1"} Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.955342 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56cbc85666-kbbjf" event={"ID":"2d50f595-31ed-4c89-ad13-63ae638b83c0","Type":"ContainerStarted","Data":"8241cfb2e3663f67e7354d6c6c51b02d9b2c80557c89e590e2bfb85ce1063b0d"} Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 
06:06:23.955391 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56cbc85666-kbbjf" event={"ID":"2d50f595-31ed-4c89-ad13-63ae638b83c0","Type":"ContainerStarted","Data":"fd7d0da666cf9f41da86d769806f8e2a79b393a2bc80813938a9605d160dbc3f"} Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.971943 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.297758412 podStartE2EDuration="58.971920664s" podCreationTimestamp="2026-02-18 06:05:25 +0000 UTC" firstStartedPulling="2026-02-18 06:05:28.101338022 +0000 UTC m=+1025.270426254" lastFinishedPulling="2026-02-18 06:06:22.775500274 +0000 UTC m=+1079.944588506" observedRunningTime="2026-02-18 06:06:23.958502137 +0000 UTC m=+1081.127590369" watchObservedRunningTime="2026-02-18 06:06:23.971920664 +0000 UTC m=+1081.141008896" Feb 18 06:06:23 crc kubenswrapper[4869]: I0218 06:06:23.990540 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 18 06:06:24 crc kubenswrapper[4869]: I0218 06:06:24.932338 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 18 06:06:24 crc kubenswrapper[4869]: I0218 06:06:24.984513 4869 generic.go:334] "Generic (PLEG): container finished" podID="cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335" containerID="267d1c742831a98c68c30fd423ca41b05928824fd5b68e811e0ac74cca82e060" exitCode=0 Feb 18 06:06:24 crc kubenswrapper[4869]: I0218 06:06:24.984571 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-c476g" event={"ID":"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335","Type":"ContainerDied","Data":"267d1c742831a98c68c30fd423ca41b05928824fd5b68e811e0ac74cca82e060"} Feb 18 06:06:24 crc kubenswrapper[4869]: I0218 06:06:24.987960 4869 generic.go:334] "Generic (PLEG): container finished" podID="e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4" containerID="e33cc65acebee6f88041033d26c8405796e8012234ffbd3c6399768d8de24797" exitCode=0 
Feb 18 06:06:24 crc kubenswrapper[4869]: I0218 06:06:24.988015 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" event={"ID":"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4","Type":"ContainerDied","Data":"e33cc65acebee6f88041033d26c8405796e8012234ffbd3c6399768d8de24797"} Feb 18 06:06:24 crc kubenswrapper[4869]: I0218 06:06:24.997937 4869 generic.go:334] "Generic (PLEG): container finished" podID="91512d0d-84f7-41c0-aca8-cbf9d2839927" containerID="dbc8dd352de1a387568d2a7a19a86c3d75ca0aa8d8e171dd7dd198f80585dca9" exitCode=0 Feb 18 06:06:24 crc kubenswrapper[4869]: I0218 06:06:24.997968 4869 generic.go:334] "Generic (PLEG): container finished" podID="91512d0d-84f7-41c0-aca8-cbf9d2839927" containerID="cb76054c6f93b947662ea5d20ca8b46ce0707c8f6eb72b79dfdf979cd0c1f034" exitCode=2 Feb 18 06:06:24 crc kubenswrapper[4869]: I0218 06:06:24.997975 4869 generic.go:334] "Generic (PLEG): container finished" podID="91512d0d-84f7-41c0-aca8-cbf9d2839927" containerID="f09f7ddec431acb4437d2f6a245a598f31fbdd07457ae52f0d2bddb63ae30624" exitCode=0 Feb 18 06:06:24 crc kubenswrapper[4869]: I0218 06:06:24.998016 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91512d0d-84f7-41c0-aca8-cbf9d2839927","Type":"ContainerDied","Data":"dbc8dd352de1a387568d2a7a19a86c3d75ca0aa8d8e171dd7dd198f80585dca9"} Feb 18 06:06:24 crc kubenswrapper[4869]: I0218 06:06:24.998039 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91512d0d-84f7-41c0-aca8-cbf9d2839927","Type":"ContainerDied","Data":"cb76054c6f93b947662ea5d20ca8b46ce0707c8f6eb72b79dfdf979cd0c1f034"} Feb 18 06:06:24 crc kubenswrapper[4869]: I0218 06:06:24.998048 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91512d0d-84f7-41c0-aca8-cbf9d2839927","Type":"ContainerDied","Data":"f09f7ddec431acb4437d2f6a245a598f31fbdd07457ae52f0d2bddb63ae30624"} Feb 18 06:06:25 crc 
kubenswrapper[4869]: I0218 06:06:25.003660 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1565c2f4-e8d2-48ac-944e-f143856732db","Type":"ContainerStarted","Data":"09f47e45b0173e0cf0604420dde417e0878620fe75c2fe30d52fc3c8ca000a89"} Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.003708 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1565c2f4-e8d2-48ac-944e-f143856732db","Type":"ContainerStarted","Data":"fbde812e694798db1a937891f10baa11b7c399f82dfa176bb8fc15f9ff3dcbc7"} Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.016450 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56cbc85666-kbbjf" event={"ID":"2d50f595-31ed-4c89-ad13-63ae638b83c0","Type":"ContainerStarted","Data":"a4a68942bd2658b39f257bcc42ff5f9e3acb414817fdf61022d741486db2e479"} Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.017368 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56cbc85666-kbbjf" Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.017397 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56cbc85666-kbbjf" Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.038296 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85cc9b8698-mzcgf" event={"ID":"00e28946-38e8-4b00-8181-d45908ad9863","Type":"ContainerStarted","Data":"b3ee6a4e32cf6a1c5ef6b9c9c7ac207f69f0c29d6a71d1786c1681dfc2be2463"} Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.038341 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85cc9b8698-mzcgf" event={"ID":"00e28946-38e8-4b00-8181-d45908ad9863","Type":"ContainerStarted","Data":"f5fc89e0ee05a7f69e9aafc37dfd4b5cc7f4216465c0ef912a70cf5f18556fad"} Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.038618 4869 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/barbican-api-85cc9b8698-mzcgf" Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.038661 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-85cc9b8698-mzcgf" Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.056617 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-56cbc85666-kbbjf" podStartSLOduration=8.056600619 podStartE2EDuration="8.056600619s" podCreationTimestamp="2026-02-18 06:06:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:25.052369376 +0000 UTC m=+1082.221457608" watchObservedRunningTime="2026-02-18 06:06:25.056600619 +0000 UTC m=+1082.225688851" Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.484340 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5b5d66448d-sbn85" Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.541259 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-85cc9b8698-mzcgf" podStartSLOduration=6.541238374 podStartE2EDuration="6.541238374s" podCreationTimestamp="2026-02-18 06:06:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:25.084472946 +0000 UTC m=+1082.253561178" watchObservedRunningTime="2026-02-18 06:06:25.541238374 +0000 UTC m=+1082.710326606" Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.751279 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-c476g" Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.857330 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-ovsdbserver-sb\") pod \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\" (UID: \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\") " Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.857419 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-dns-svc\") pod \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\" (UID: \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\") " Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.857463 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55kzf\" (UniqueName: \"kubernetes.io/projected/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-kube-api-access-55kzf\") pod \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\" (UID: \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\") " Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.857650 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-config\") pod \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\" (UID: \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\") " Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.857685 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-dns-swift-storage-0\") pod \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\" (UID: \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\") " Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.857717 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-ovsdbserver-nb\") pod \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\" (UID: \"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335\") " Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.899551 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5466c5cc6f-xd5xf"] Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.899857 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5466c5cc6f-xd5xf" podUID="2f13a3e9-97c1-4eaa-a0fb-f449a201a542" containerName="neutron-api" containerID="cri-o://fe797e117876091705bdd0135f414262e2fd9f9a35401ac398945c6ef78e7287" gracePeriod=30 Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.906081 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5466c5cc6f-xd5xf" podUID="2f13a3e9-97c1-4eaa-a0fb-f449a201a542" containerName="neutron-httpd" containerID="cri-o://00595b0d60cbad75fbb436effe6268e3f8744bf8bf10b2a142cfd6cc2a285089" gracePeriod=30 Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.914129 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-kube-api-access-55kzf" (OuterVolumeSpecName: "kube-api-access-55kzf") pod "cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335" (UID: "cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335"). InnerVolumeSpecName "kube-api-access-55kzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.924262 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335" (UID: "cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.932879 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335" (UID: "cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.946053 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335" (UID: "cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.967394 4869 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.967824 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.967836 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.967844 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55kzf\" (UniqueName: 
\"kubernetes.io/projected/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-kube-api-access-55kzf\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.969814 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335" (UID: "cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.974427 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6d4594595c-zvnfb"] Feb 18 06:06:25 crc kubenswrapper[4869]: E0218 06:06:25.974848 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335" containerName="init" Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.974860 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335" containerName="init" Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.975043 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335" containerName="init" Feb 18 06:06:25 crc kubenswrapper[4869]: I0218 06:06:25.976149 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6d4594595c-zvnfb" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.029939 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-config" (OuterVolumeSpecName: "config") pod "cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335" (UID: "cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.070916 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.070943 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.169775 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d4594595c-zvnfb"] Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.178337 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dffbc8b7-8080-4958-915d-ee66f5ae732b-ovndb-tls-certs\") pod \"neutron-6d4594595c-zvnfb\" (UID: \"dffbc8b7-8080-4958-915d-ee66f5ae732b\") " pod="openstack/neutron-6d4594595c-zvnfb" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.178410 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dffbc8b7-8080-4958-915d-ee66f5ae732b-httpd-config\") pod \"neutron-6d4594595c-zvnfb\" (UID: \"dffbc8b7-8080-4958-915d-ee66f5ae732b\") " pod="openstack/neutron-6d4594595c-zvnfb" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.178498 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dffbc8b7-8080-4958-915d-ee66f5ae732b-internal-tls-certs\") pod \"neutron-6d4594595c-zvnfb\" (UID: \"dffbc8b7-8080-4958-915d-ee66f5ae732b\") " pod="openstack/neutron-6d4594595c-zvnfb" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.178533 
4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dffbc8b7-8080-4958-915d-ee66f5ae732b-combined-ca-bundle\") pod \"neutron-6d4594595c-zvnfb\" (UID: \"dffbc8b7-8080-4958-915d-ee66f5ae732b\") " pod="openstack/neutron-6d4594595c-zvnfb" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.178596 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dffbc8b7-8080-4958-915d-ee66f5ae732b-config\") pod \"neutron-6d4594595c-zvnfb\" (UID: \"dffbc8b7-8080-4958-915d-ee66f5ae732b\") " pod="openstack/neutron-6d4594595c-zvnfb" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.178671 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dffbc8b7-8080-4958-915d-ee66f5ae732b-public-tls-certs\") pod \"neutron-6d4594595c-zvnfb\" (UID: \"dffbc8b7-8080-4958-915d-ee66f5ae732b\") " pod="openstack/neutron-6d4594595c-zvnfb" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.178705 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sc29\" (UniqueName: \"kubernetes.io/projected/dffbc8b7-8080-4958-915d-ee66f5ae732b-kube-api-access-9sc29\") pod \"neutron-6d4594595c-zvnfb\" (UID: \"dffbc8b7-8080-4958-915d-ee66f5ae732b\") " pod="openstack/neutron-6d4594595c-zvnfb" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.185110 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"346e57be-67e8-41ae-8c35-8ca7be52d847","Type":"ContainerStarted","Data":"2b674ef05b97689ad31861ae1b0ac85b1b20724d20ada4f458e120c9993aef35"} Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.203541 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-85ff748b95-c476g" event={"ID":"cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335","Type":"ContainerDied","Data":"5661f437321aeda22b848c700a1f4a1c26e0e83a24c3209ee3a4f70b83d9b9dc"} Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.203625 4869 scope.go:117] "RemoveContainer" containerID="267d1c742831a98c68c30fd423ca41b05928824fd5b68e811e0ac74cca82e060" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.203851 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-c476g" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.272389 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-c476g"] Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.279573 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-c476g"] Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.281232 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dffbc8b7-8080-4958-915d-ee66f5ae732b-internal-tls-certs\") pod \"neutron-6d4594595c-zvnfb\" (UID: \"dffbc8b7-8080-4958-915d-ee66f5ae732b\") " pod="openstack/neutron-6d4594595c-zvnfb" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.281277 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dffbc8b7-8080-4958-915d-ee66f5ae732b-combined-ca-bundle\") pod \"neutron-6d4594595c-zvnfb\" (UID: \"dffbc8b7-8080-4958-915d-ee66f5ae732b\") " pod="openstack/neutron-6d4594595c-zvnfb" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.281316 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dffbc8b7-8080-4958-915d-ee66f5ae732b-config\") pod \"neutron-6d4594595c-zvnfb\" (UID: \"dffbc8b7-8080-4958-915d-ee66f5ae732b\") " 
pod="openstack/neutron-6d4594595c-zvnfb" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.281372 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dffbc8b7-8080-4958-915d-ee66f5ae732b-public-tls-certs\") pod \"neutron-6d4594595c-zvnfb\" (UID: \"dffbc8b7-8080-4958-915d-ee66f5ae732b\") " pod="openstack/neutron-6d4594595c-zvnfb" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.281394 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sc29\" (UniqueName: \"kubernetes.io/projected/dffbc8b7-8080-4958-915d-ee66f5ae732b-kube-api-access-9sc29\") pod \"neutron-6d4594595c-zvnfb\" (UID: \"dffbc8b7-8080-4958-915d-ee66f5ae732b\") " pod="openstack/neutron-6d4594595c-zvnfb" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.281435 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dffbc8b7-8080-4958-915d-ee66f5ae732b-ovndb-tls-certs\") pod \"neutron-6d4594595c-zvnfb\" (UID: \"dffbc8b7-8080-4958-915d-ee66f5ae732b\") " pod="openstack/neutron-6d4594595c-zvnfb" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.281559 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dffbc8b7-8080-4958-915d-ee66f5ae732b-httpd-config\") pod \"neutron-6d4594595c-zvnfb\" (UID: \"dffbc8b7-8080-4958-915d-ee66f5ae732b\") " pod="openstack/neutron-6d4594595c-zvnfb" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.288346 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dffbc8b7-8080-4958-915d-ee66f5ae732b-ovndb-tls-certs\") pod \"neutron-6d4594595c-zvnfb\" (UID: \"dffbc8b7-8080-4958-915d-ee66f5ae732b\") " pod="openstack/neutron-6d4594595c-zvnfb" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 
06:06:26.292343 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dffbc8b7-8080-4958-915d-ee66f5ae732b-public-tls-certs\") pod \"neutron-6d4594595c-zvnfb\" (UID: \"dffbc8b7-8080-4958-915d-ee66f5ae732b\") " pod="openstack/neutron-6d4594595c-zvnfb" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.292718 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dffbc8b7-8080-4958-915d-ee66f5ae732b-combined-ca-bundle\") pod \"neutron-6d4594595c-zvnfb\" (UID: \"dffbc8b7-8080-4958-915d-ee66f5ae732b\") " pod="openstack/neutron-6d4594595c-zvnfb" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.292995 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dffbc8b7-8080-4958-915d-ee66f5ae732b-internal-tls-certs\") pod \"neutron-6d4594595c-zvnfb\" (UID: \"dffbc8b7-8080-4958-915d-ee66f5ae732b\") " pod="openstack/neutron-6d4594595c-zvnfb" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.294532 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/dffbc8b7-8080-4958-915d-ee66f5ae732b-config\") pod \"neutron-6d4594595c-zvnfb\" (UID: \"dffbc8b7-8080-4958-915d-ee66f5ae732b\") " pod="openstack/neutron-6d4594595c-zvnfb" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.314663 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dffbc8b7-8080-4958-915d-ee66f5ae732b-httpd-config\") pod \"neutron-6d4594595c-zvnfb\" (UID: \"dffbc8b7-8080-4958-915d-ee66f5ae732b\") " pod="openstack/neutron-6d4594595c-zvnfb" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.320555 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sc29\" (UniqueName: 
\"kubernetes.io/projected/dffbc8b7-8080-4958-915d-ee66f5ae732b-kube-api-access-9sc29\") pod \"neutron-6d4594595c-zvnfb\" (UID: \"dffbc8b7-8080-4958-915d-ee66f5ae732b\") " pod="openstack/neutron-6d4594595c-zvnfb" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.377494 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5466c5cc6f-xd5xf" podUID="2f13a3e9-97c1-4eaa-a0fb-f449a201a542" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9696/\": read tcp 10.217.0.2:47832->10.217.0.156:9696: read: connection reset by peer" Feb 18 06:06:26 crc kubenswrapper[4869]: I0218 06:06:26.415159 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6d4594595c-zvnfb" Feb 18 06:06:27 crc kubenswrapper[4869]: I0218 06:06:27.234130 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" event={"ID":"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4","Type":"ContainerStarted","Data":"86821b867917f19b5e7cf53b0d8f4e915c7e763184a8c5a5374ada8f15d21bb2"} Feb 18 06:06:27 crc kubenswrapper[4869]: I0218 06:06:27.235218 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" Feb 18 06:06:27 crc kubenswrapper[4869]: W0218 06:06:27.248206 4869 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5ed2006_0bdb_4fa9_a336_a6b5b7357ac4.slice/crio-e33cc65acebee6f88041033d26c8405796e8012234ffbd3c6399768d8de24797.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5ed2006_0bdb_4fa9_a336_a6b5b7357ac4.slice/crio-e33cc65acebee6f88041033d26c8405796e8012234ffbd3c6399768d8de24797.scope: no such file or directory Feb 18 06:06:27 crc kubenswrapper[4869]: I0218 06:06:27.253125 4869 generic.go:334] "Generic (PLEG): container finished" 
podID="0b9571fd-7aa2-4e30-81f9-465a9c4291c8" containerID="133059aaf4bdfa3e2a9413f5b02bf21f805d0f540f64d9c0062fd942f5366210" exitCode=137 Feb 18 06:06:27 crc kubenswrapper[4869]: I0218 06:06:27.253213 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-588975f6bf-9447r" event={"ID":"0b9571fd-7aa2-4e30-81f9-465a9c4291c8","Type":"ContainerDied","Data":"133059aaf4bdfa3e2a9413f5b02bf21f805d0f540f64d9c0062fd942f5366210"} Feb 18 06:06:27 crc kubenswrapper[4869]: I0218 06:06:27.256714 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" podStartSLOduration=5.256693665 podStartE2EDuration="5.256693665s" podCreationTimestamp="2026-02-18 06:06:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:27.253310584 +0000 UTC m=+1084.422398816" watchObservedRunningTime="2026-02-18 06:06:27.256693665 +0000 UTC m=+1084.425781897" Feb 18 06:06:27 crc kubenswrapper[4869]: W0218 06:06:27.257191 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf7eb3a7_e11d_44c4_bec1_4f8b9a67b335.slice/crio-267d1c742831a98c68c30fd423ca41b05928824fd5b68e811e0ac74cca82e060.scope WatchSource:0}: Error finding container 267d1c742831a98c68c30fd423ca41b05928824fd5b68e811e0ac74cca82e060: Status 404 returned error can't find the container with id 267d1c742831a98c68c30fd423ca41b05928824fd5b68e811e0ac74cca82e060 Feb 18 06:06:27 crc kubenswrapper[4869]: I0218 06:06:27.265524 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5884488646-54ch2" event={"ID":"4ed72952-b72a-4f66-8e63-84d18936ff3a","Type":"ContainerStarted","Data":"f1a832daa3d6cb198cca284c64fb99950bc74badbb00d30b2d852d5e53de1157"} Feb 18 06:06:27 crc kubenswrapper[4869]: I0218 06:06:27.272446 4869 generic.go:334] "Generic (PLEG): 
container finished" podID="2f13a3e9-97c1-4eaa-a0fb-f449a201a542" containerID="00595b0d60cbad75fbb436effe6268e3f8744bf8bf10b2a142cfd6cc2a285089" exitCode=0 Feb 18 06:06:27 crc kubenswrapper[4869]: I0218 06:06:27.275919 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5466c5cc6f-xd5xf" event={"ID":"2f13a3e9-97c1-4eaa-a0fb-f449a201a542","Type":"ContainerDied","Data":"00595b0d60cbad75fbb436effe6268e3f8744bf8bf10b2a142cfd6cc2a285089"} Feb 18 06:06:27 crc kubenswrapper[4869]: I0218 06:06:27.403144 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d4594595c-zvnfb"] Feb 18 06:06:27 crc kubenswrapper[4869]: W0218 06:06:27.480120 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddffbc8b7_8080_4958_915d_ee66f5ae732b.slice/crio-c591f23da29a9443f28225c221a5ac1ea3c9b03509b391f6291438ac790bf9dc WatchSource:0}: Error finding container c591f23da29a9443f28225c221a5ac1ea3c9b03509b391f6291438ac790bf9dc: Status 404 returned error can't find the container with id c591f23da29a9443f28225c221a5ac1ea3c9b03509b391f6291438ac790bf9dc Feb 18 06:06:27 crc kubenswrapper[4869]: I0218 06:06:27.500487 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335" path="/var/lib/kubelet/pods/cf7eb3a7-e11d-44c4-bec1-4f8b9a67b335/volumes" Feb 18 06:06:27 crc kubenswrapper[4869]: I0218 06:06:27.501326 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:06:27 crc kubenswrapper[4869]: I0218 06:06:27.818500 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-57f5fddd88-qhh5n" Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.303380 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-588975f6bf-9447r" Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.323594 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"346e57be-67e8-41ae-8c35-8ca7be52d847","Type":"ContainerStarted","Data":"66d076ecdfaed3c8113dfb58c15b90a792d6bfbbb8f2080fd7a314b410017b9e"} Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.347423 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76444b47f5-mc7kc" Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.352739 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-98f68fcf5-pz8rv" event={"ID":"c3b79627-ea10-4a59-a5ae-f24d3ace238d","Type":"ContainerStarted","Data":"4c4e54b78dbcfef1fe36ca7d080b6c70781e69b27485976e228d3ad47a7c3e85"} Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.352829 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-98f68fcf5-pz8rv" event={"ID":"c3b79627-ea10-4a59-a5ae-f24d3ace238d","Type":"ContainerStarted","Data":"7fdf76e3bd79f3a505d57608731405d41a3c71c308ccb023997c89ab733b47e7"} Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.354783 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d4594595c-zvnfb" event={"ID":"dffbc8b7-8080-4958-915d-ee66f5ae732b","Type":"ContainerStarted","Data":"689e152ea2102a873fd022a283ac251c9d61fdd471d9cdfdf3b3abdef6c9ec2f"} Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.354839 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d4594595c-zvnfb" event={"ID":"dffbc8b7-8080-4958-915d-ee66f5ae732b","Type":"ContainerStarted","Data":"c591f23da29a9443f28225c221a5ac1ea3c9b03509b391f6291438ac790bf9dc"} Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.357939 4869 generic.go:334] "Generic (PLEG): container finished" podID="e4fa15d8-bacc-4ce0-bd25-41e451404ab3" 
containerID="aaf2b7c22977689767c686369358b3000da99eb4f168cee6c4f01c96f3fd4313" exitCode=137 Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.357971 4869 generic.go:334] "Generic (PLEG): container finished" podID="e4fa15d8-bacc-4ce0-bd25-41e451404ab3" containerID="6b53a17c713e4c990e9ff86e1c691ccfe97fd8d5df4122747f7c2708ef7013b0" exitCode=137 Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.358019 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76444b47f5-mc7kc" event={"ID":"e4fa15d8-bacc-4ce0-bd25-41e451404ab3","Type":"ContainerDied","Data":"aaf2b7c22977689767c686369358b3000da99eb4f168cee6c4f01c96f3fd4313"} Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.358066 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76444b47f5-mc7kc" event={"ID":"e4fa15d8-bacc-4ce0-bd25-41e451404ab3","Type":"ContainerDied","Data":"6b53a17c713e4c990e9ff86e1c691ccfe97fd8d5df4122747f7c2708ef7013b0"} Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.358078 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76444b47f5-mc7kc" event={"ID":"e4fa15d8-bacc-4ce0-bd25-41e451404ab3","Type":"ContainerDied","Data":"a87ca12110732479bd31bc54a3166cf1a592f9c7963d09e005d603c4b93c5196"} Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.358096 4869 scope.go:117] "RemoveContainer" containerID="aaf2b7c22977689767c686369358b3000da99eb4f168cee6c4f01c96f3fd4313" Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.364048 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-76444b47f5-mc7kc" Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.369591 4869 generic.go:334] "Generic (PLEG): container finished" podID="91512d0d-84f7-41c0-aca8-cbf9d2839927" containerID="ddd99ad38a349df8ed1aa180885bb67905bce7cc10ac1040ed3d5c8c37bc13b5" exitCode=0 Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.369665 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91512d0d-84f7-41c0-aca8-cbf9d2839927","Type":"ContainerDied","Data":"ddd99ad38a349df8ed1aa180885bb67905bce7cc10ac1040ed3d5c8c37bc13b5"} Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.411101 4869 generic.go:334] "Generic (PLEG): container finished" podID="0b9571fd-7aa2-4e30-81f9-465a9c4291c8" containerID="0c33ee7dce2e3e8ae9c91cf94f0415ddb5d6ceecec750a69841709572791bcb1" exitCode=137 Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.411197 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-588975f6bf-9447r" event={"ID":"0b9571fd-7aa2-4e30-81f9-465a9c4291c8","Type":"ContainerDied","Data":"0c33ee7dce2e3e8ae9c91cf94f0415ddb5d6ceecec750a69841709572791bcb1"} Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.411233 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-588975f6bf-9447r" event={"ID":"0b9571fd-7aa2-4e30-81f9-465a9c4291c8","Type":"ContainerDied","Data":"74f418bd00b17e46b978b01deb0d626d6353739e0224c848a7e2fd3390f8ef04"} Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.411360 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-588975f6bf-9447r" Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.418009 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.718242739 podStartE2EDuration="6.417983811s" podCreationTimestamp="2026-02-18 06:06:22 +0000 UTC" firstStartedPulling="2026-02-18 06:06:23.776597848 +0000 UTC m=+1080.945686080" lastFinishedPulling="2026-02-18 06:06:24.47633892 +0000 UTC m=+1081.645427152" observedRunningTime="2026-02-18 06:06:28.363243231 +0000 UTC m=+1085.532331463" watchObservedRunningTime="2026-02-18 06:06:28.417983811 +0000 UTC m=+1085.587072043" Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.423682 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.437354 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5884488646-54ch2" event={"ID":"4ed72952-b72a-4f66-8e63-84d18936ff3a","Type":"ContainerStarted","Data":"2fa9f281d7d2818aed812a285f3cc780b106bea6fab583b6b72dffbb65bae884"} Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.454520 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-horizon-secret-key\") pod \"0b9571fd-7aa2-4e30-81f9-465a9c4291c8\" (UID: \"0b9571fd-7aa2-4e30-81f9-465a9c4291c8\") " Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.454651 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn9nr\" (UniqueName: \"kubernetes.io/projected/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-kube-api-access-bn9nr\") pod \"0b9571fd-7aa2-4e30-81f9-465a9c4291c8\" (UID: \"0b9571fd-7aa2-4e30-81f9-465a9c4291c8\") " Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.454680 4869 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-logs\") pod \"0b9571fd-7aa2-4e30-81f9-465a9c4291c8\" (UID: \"0b9571fd-7aa2-4e30-81f9-465a9c4291c8\") " Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.454774 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-config-data\") pod \"0b9571fd-7aa2-4e30-81f9-465a9c4291c8\" (UID: \"0b9571fd-7aa2-4e30-81f9-465a9c4291c8\") " Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.454834 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-logs\") pod \"e4fa15d8-bacc-4ce0-bd25-41e451404ab3\" (UID: \"e4fa15d8-bacc-4ce0-bd25-41e451404ab3\") " Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.454853 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzmn8\" (UniqueName: \"kubernetes.io/projected/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-kube-api-access-hzmn8\") pod \"e4fa15d8-bacc-4ce0-bd25-41e451404ab3\" (UID: \"e4fa15d8-bacc-4ce0-bd25-41e451404ab3\") " Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.454888 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-horizon-secret-key\") pod \"e4fa15d8-bacc-4ce0-bd25-41e451404ab3\" (UID: \"e4fa15d8-bacc-4ce0-bd25-41e451404ab3\") " Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.454958 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-scripts\") pod \"e4fa15d8-bacc-4ce0-bd25-41e451404ab3\" (UID: 
\"e4fa15d8-bacc-4ce0-bd25-41e451404ab3\") "
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.454999 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-config-data\") pod \"e4fa15d8-bacc-4ce0-bd25-41e451404ab3\" (UID: \"e4fa15d8-bacc-4ce0-bd25-41e451404ab3\") "
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.455052 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-scripts\") pod \"0b9571fd-7aa2-4e30-81f9-465a9c4291c8\" (UID: \"0b9571fd-7aa2-4e30-81f9-465a9c4291c8\") "
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.455488 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-logs" (OuterVolumeSpecName: "logs") pod "0b9571fd-7aa2-4e30-81f9-465a9c4291c8" (UID: "0b9571fd-7aa2-4e30-81f9-465a9c4291c8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.456170 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-98f68fcf5-pz8rv" podStartSLOduration=8.262339008 podStartE2EDuration="11.456146739s" podCreationTimestamp="2026-02-18 06:06:17 +0000 UTC" firstStartedPulling="2026-02-18 06:06:23.598650864 +0000 UTC m=+1080.767739096" lastFinishedPulling="2026-02-18 06:06:26.792458595 +0000 UTC m=+1083.961546827" observedRunningTime="2026-02-18 06:06:28.389070149 +0000 UTC m=+1085.558158381" watchObservedRunningTime="2026-02-18 06:06:28.456146739 +0000 UTC m=+1085.625234971"
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.463278 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-logs" (OuterVolumeSpecName: "logs") pod "e4fa15d8-bacc-4ce0-bd25-41e451404ab3" (UID: "e4fa15d8-bacc-4ce0-bd25-41e451404ab3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.467647 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1565c2f4-e8d2-48ac-944e-f143856732db" containerName="cinder-api-log" containerID="cri-o://09f47e45b0173e0cf0604420dde417e0878620fe75c2fe30d52fc3c8ca000a89" gracePeriod=30
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.468694 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1565c2f4-e8d2-48ac-944e-f143856732db","Type":"ContainerStarted","Data":"03ff2cb1b200693813e0239e3bb0d9f24e7713d139dccd116b4f714edf9c92e8"}
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.468771 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.468915 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1565c2f4-e8d2-48ac-944e-f143856732db" containerName="cinder-api" containerID="cri-o://03ff2cb1b200693813e0239e3bb0d9f24e7713d139dccd116b4f714edf9c92e8" gracePeriod=30
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.473887 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-kube-api-access-bn9nr" (OuterVolumeSpecName: "kube-api-access-bn9nr") pod "0b9571fd-7aa2-4e30-81f9-465a9c4291c8" (UID: "0b9571fd-7aa2-4e30-81f9-465a9c4291c8"). InnerVolumeSpecName "kube-api-access-bn9nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.473945 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-kube-api-access-hzmn8" (OuterVolumeSpecName: "kube-api-access-hzmn8") pod "e4fa15d8-bacc-4ce0-bd25-41e451404ab3" (UID: "e4fa15d8-bacc-4ce0-bd25-41e451404ab3"). InnerVolumeSpecName "kube-api-access-hzmn8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.474040 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "0b9571fd-7aa2-4e30-81f9-465a9c4291c8" (UID: "0b9571fd-7aa2-4e30-81f9-465a9c4291c8"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.483295 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e4fa15d8-bacc-4ce0-bd25-41e451404ab3" (UID: "e4fa15d8-bacc-4ce0-bd25-41e451404ab3"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.524214 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-scripts" (OuterVolumeSpecName: "scripts") pod "e4fa15d8-bacc-4ce0-bd25-41e451404ab3" (UID: "e4fa15d8-bacc-4ce0-bd25-41e451404ab3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.525591 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-scripts" (OuterVolumeSpecName: "scripts") pod "0b9571fd-7aa2-4e30-81f9-465a9c4291c8" (UID: "0b9571fd-7aa2-4e30-81f9-465a9c4291c8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.526751 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-config-data" (OuterVolumeSpecName: "config-data") pod "e4fa15d8-bacc-4ce0-bd25-41e451404ab3" (UID: "e4fa15d8-bacc-4ce0-bd25-41e451404ab3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.527736 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-config-data" (OuterVolumeSpecName: "config-data") pod "0b9571fd-7aa2-4e30-81f9-465a9c4291c8" (UID: "0b9571fd-7aa2-4e30-81f9-465a9c4291c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.546433 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.546409012 podStartE2EDuration="5.546409012s" podCreationTimestamp="2026-02-18 06:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:28.534249846 +0000 UTC m=+1085.703338088" watchObservedRunningTime="2026-02-18 06:06:28.546409012 +0000 UTC m=+1085.715497244"
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.558202 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91512d0d-84f7-41c0-aca8-cbf9d2839927-log-httpd\") pod \"91512d0d-84f7-41c0-aca8-cbf9d2839927\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") "
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.558321 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91512d0d-84f7-41c0-aca8-cbf9d2839927-sg-core-conf-yaml\") pod \"91512d0d-84f7-41c0-aca8-cbf9d2839927\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") "
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.558454 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91512d0d-84f7-41c0-aca8-cbf9d2839927-config-data\") pod \"91512d0d-84f7-41c0-aca8-cbf9d2839927\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") "
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.558514 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91512d0d-84f7-41c0-aca8-cbf9d2839927-scripts\") pod \"91512d0d-84f7-41c0-aca8-cbf9d2839927\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") "
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.558534 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91512d0d-84f7-41c0-aca8-cbf9d2839927-run-httpd\") pod \"91512d0d-84f7-41c0-aca8-cbf9d2839927\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") "
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.558556 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45fbh\" (UniqueName: \"kubernetes.io/projected/91512d0d-84f7-41c0-aca8-cbf9d2839927-kube-api-access-45fbh\") pod \"91512d0d-84f7-41c0-aca8-cbf9d2839927\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") "
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.558606 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91512d0d-84f7-41c0-aca8-cbf9d2839927-combined-ca-bundle\") pod \"91512d0d-84f7-41c0-aca8-cbf9d2839927\" (UID: \"91512d0d-84f7-41c0-aca8-cbf9d2839927\") "
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.559538 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.559555 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzmn8\" (UniqueName: \"kubernetes.io/projected/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-kube-api-access-hzmn8\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.559569 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-logs\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.559579 4869 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.559592 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.559602 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4fa15d8-bacc-4ce0-bd25-41e451404ab3-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.559611 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.559624 4869 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.559639 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn9nr\" (UniqueName: \"kubernetes.io/projected/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-kube-api-access-bn9nr\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.559650 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b9571fd-7aa2-4e30-81f9-465a9c4291c8-logs\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.566431 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91512d0d-84f7-41c0-aca8-cbf9d2839927-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "91512d0d-84f7-41c0-aca8-cbf9d2839927" (UID: "91512d0d-84f7-41c0-aca8-cbf9d2839927"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.569376 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91512d0d-84f7-41c0-aca8-cbf9d2839927-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "91512d0d-84f7-41c0-aca8-cbf9d2839927" (UID: "91512d0d-84f7-41c0-aca8-cbf9d2839927"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.589286 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91512d0d-84f7-41c0-aca8-cbf9d2839927-scripts" (OuterVolumeSpecName: "scripts") pod "91512d0d-84f7-41c0-aca8-cbf9d2839927" (UID: "91512d0d-84f7-41c0-aca8-cbf9d2839927"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.589580 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91512d0d-84f7-41c0-aca8-cbf9d2839927-kube-api-access-45fbh" (OuterVolumeSpecName: "kube-api-access-45fbh") pod "91512d0d-84f7-41c0-aca8-cbf9d2839927" (UID: "91512d0d-84f7-41c0-aca8-cbf9d2839927"). InnerVolumeSpecName "kube-api-access-45fbh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.608896 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5884488646-54ch2" podStartSLOduration=8.447144648 podStartE2EDuration="11.60887213s" podCreationTimestamp="2026-02-18 06:06:17 +0000 UTC" firstStartedPulling="2026-02-18 06:06:23.590063085 +0000 UTC m=+1080.759151317" lastFinishedPulling="2026-02-18 06:06:26.751790567 +0000 UTC m=+1083.920878799" observedRunningTime="2026-02-18 06:06:28.601228884 +0000 UTC m=+1085.770317116" watchObservedRunningTime="2026-02-18 06:06:28.60887213 +0000 UTC m=+1085.777960362"
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.656467 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91512d0d-84f7-41c0-aca8-cbf9d2839927-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "91512d0d-84f7-41c0-aca8-cbf9d2839927" (UID: "91512d0d-84f7-41c0-aca8-cbf9d2839927"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.663636 4869 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91512d0d-84f7-41c0-aca8-cbf9d2839927-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.663672 4869 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91512d0d-84f7-41c0-aca8-cbf9d2839927-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.663682 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91512d0d-84f7-41c0-aca8-cbf9d2839927-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.663692 4869 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91512d0d-84f7-41c0-aca8-cbf9d2839927-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.663702 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45fbh\" (UniqueName: \"kubernetes.io/projected/91512d0d-84f7-41c0-aca8-cbf9d2839927-kube-api-access-45fbh\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.673540 4869 scope.go:117] "RemoveContainer" containerID="6b53a17c713e4c990e9ff86e1c691ccfe97fd8d5df4122747f7c2708ef7013b0"
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.719435 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91512d0d-84f7-41c0-aca8-cbf9d2839927-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91512d0d-84f7-41c0-aca8-cbf9d2839927" (UID: "91512d0d-84f7-41c0-aca8-cbf9d2839927"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.748913 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91512d0d-84f7-41c0-aca8-cbf9d2839927-config-data" (OuterVolumeSpecName: "config-data") pod "91512d0d-84f7-41c0-aca8-cbf9d2839927" (UID: "91512d0d-84f7-41c0-aca8-cbf9d2839927"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.767005 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91512d0d-84f7-41c0-aca8-cbf9d2839927-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.767043 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91512d0d-84f7-41c0-aca8-cbf9d2839927-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.850557 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-588975f6bf-9447r"]
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.854068 4869 scope.go:117] "RemoveContainer" containerID="aaf2b7c22977689767c686369358b3000da99eb4f168cee6c4f01c96f3fd4313"
Feb 18 06:06:28 crc kubenswrapper[4869]: E0218 06:06:28.857973 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaf2b7c22977689767c686369358b3000da99eb4f168cee6c4f01c96f3fd4313\": container with ID starting with aaf2b7c22977689767c686369358b3000da99eb4f168cee6c4f01c96f3fd4313 not found: ID does not exist" containerID="aaf2b7c22977689767c686369358b3000da99eb4f168cee6c4f01c96f3fd4313"
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.858021 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaf2b7c22977689767c686369358b3000da99eb4f168cee6c4f01c96f3fd4313"} err="failed to get container status \"aaf2b7c22977689767c686369358b3000da99eb4f168cee6c4f01c96f3fd4313\": rpc error: code = NotFound desc = could not find container \"aaf2b7c22977689767c686369358b3000da99eb4f168cee6c4f01c96f3fd4313\": container with ID starting with aaf2b7c22977689767c686369358b3000da99eb4f168cee6c4f01c96f3fd4313 not found: ID does not exist"
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.858052 4869 scope.go:117] "RemoveContainer" containerID="6b53a17c713e4c990e9ff86e1c691ccfe97fd8d5df4122747f7c2708ef7013b0"
Feb 18 06:06:28 crc kubenswrapper[4869]: E0218 06:06:28.861774 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b53a17c713e4c990e9ff86e1c691ccfe97fd8d5df4122747f7c2708ef7013b0\": container with ID starting with 6b53a17c713e4c990e9ff86e1c691ccfe97fd8d5df4122747f7c2708ef7013b0 not found: ID does not exist" containerID="6b53a17c713e4c990e9ff86e1c691ccfe97fd8d5df4122747f7c2708ef7013b0"
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.861813 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b53a17c713e4c990e9ff86e1c691ccfe97fd8d5df4122747f7c2708ef7013b0"} err="failed to get container status \"6b53a17c713e4c990e9ff86e1c691ccfe97fd8d5df4122747f7c2708ef7013b0\": rpc error: code = NotFound desc = could not find container \"6b53a17c713e4c990e9ff86e1c691ccfe97fd8d5df4122747f7c2708ef7013b0\": container with ID starting with 6b53a17c713e4c990e9ff86e1c691ccfe97fd8d5df4122747f7c2708ef7013b0 not found: ID does not exist"
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.861835 4869 scope.go:117] "RemoveContainer" containerID="aaf2b7c22977689767c686369358b3000da99eb4f168cee6c4f01c96f3fd4313"
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.863601 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaf2b7c22977689767c686369358b3000da99eb4f168cee6c4f01c96f3fd4313"} err="failed to get container status \"aaf2b7c22977689767c686369358b3000da99eb4f168cee6c4f01c96f3fd4313\": rpc error: code = NotFound desc = could not find container \"aaf2b7c22977689767c686369358b3000da99eb4f168cee6c4f01c96f3fd4313\": container with ID starting with aaf2b7c22977689767c686369358b3000da99eb4f168cee6c4f01c96f3fd4313 not found: ID does not exist"
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.863624 4869 scope.go:117] "RemoveContainer" containerID="6b53a17c713e4c990e9ff86e1c691ccfe97fd8d5df4122747f7c2708ef7013b0"
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.864579 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b53a17c713e4c990e9ff86e1c691ccfe97fd8d5df4122747f7c2708ef7013b0"} err="failed to get container status \"6b53a17c713e4c990e9ff86e1c691ccfe97fd8d5df4122747f7c2708ef7013b0\": rpc error: code = NotFound desc = could not find container \"6b53a17c713e4c990e9ff86e1c691ccfe97fd8d5df4122747f7c2708ef7013b0\": container with ID starting with 6b53a17c713e4c990e9ff86e1c691ccfe97fd8d5df4122747f7c2708ef7013b0 not found: ID does not exist"
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.864600 4869 scope.go:117] "RemoveContainer" containerID="0c33ee7dce2e3e8ae9c91cf94f0415ddb5d6ceecec750a69841709572791bcb1"
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.896757 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-588975f6bf-9447r"]
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.906584 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76444b47f5-mc7kc"]
Feb 18 06:06:28 crc kubenswrapper[4869]: I0218 06:06:28.911653 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-76444b47f5-mc7kc"]
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.066088 4869 scope.go:117] "RemoveContainer" containerID="133059aaf4bdfa3e2a9413f5b02bf21f805d0f540f64d9c0062fd942f5366210"
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.135525 4869 scope.go:117] "RemoveContainer" containerID="0c33ee7dce2e3e8ae9c91cf94f0415ddb5d6ceecec750a69841709572791bcb1"
Feb 18 06:06:29 crc kubenswrapper[4869]: E0218 06:06:29.136057 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c33ee7dce2e3e8ae9c91cf94f0415ddb5d6ceecec750a69841709572791bcb1\": container with ID starting with 0c33ee7dce2e3e8ae9c91cf94f0415ddb5d6ceecec750a69841709572791bcb1 not found: ID does not exist" containerID="0c33ee7dce2e3e8ae9c91cf94f0415ddb5d6ceecec750a69841709572791bcb1"
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.136115 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c33ee7dce2e3e8ae9c91cf94f0415ddb5d6ceecec750a69841709572791bcb1"} err="failed to get container status \"0c33ee7dce2e3e8ae9c91cf94f0415ddb5d6ceecec750a69841709572791bcb1\": rpc error: code = NotFound desc = could not find container \"0c33ee7dce2e3e8ae9c91cf94f0415ddb5d6ceecec750a69841709572791bcb1\": container with ID starting with 0c33ee7dce2e3e8ae9c91cf94f0415ddb5d6ceecec750a69841709572791bcb1 not found: ID does not exist"
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.136139 4869 scope.go:117] "RemoveContainer" containerID="133059aaf4bdfa3e2a9413f5b02bf21f805d0f540f64d9c0062fd942f5366210"
Feb 18 06:06:29 crc kubenswrapper[4869]: E0218 06:06:29.136385 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"133059aaf4bdfa3e2a9413f5b02bf21f805d0f540f64d9c0062fd942f5366210\": container with ID starting with 133059aaf4bdfa3e2a9413f5b02bf21f805d0f540f64d9c0062fd942f5366210 not found: ID does not exist" containerID="133059aaf4bdfa3e2a9413f5b02bf21f805d0f540f64d9c0062fd942f5366210"
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.136409 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"133059aaf4bdfa3e2a9413f5b02bf21f805d0f540f64d9c0062fd942f5366210"} err="failed to get container status \"133059aaf4bdfa3e2a9413f5b02bf21f805d0f540f64d9c0062fd942f5366210\": rpc error: code = NotFound desc = could not find container \"133059aaf4bdfa3e2a9413f5b02bf21f805d0f540f64d9c0062fd942f5366210\": container with ID starting with 133059aaf4bdfa3e2a9413f5b02bf21f805d0f540f64d9c0062fd942f5366210 not found: ID does not exist"
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.316501 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5466c5cc6f-xd5xf" podUID="2f13a3e9-97c1-4eaa-a0fb-f449a201a542" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9696/\": dial tcp 10.217.0.156:9696: connect: connection refused"
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.458363 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.478213 4869 generic.go:334] "Generic (PLEG): container finished" podID="1565c2f4-e8d2-48ac-944e-f143856732db" containerID="03ff2cb1b200693813e0239e3bb0d9f24e7713d139dccd116b4f714edf9c92e8" exitCode=0
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.478247 4869 generic.go:334] "Generic (PLEG): container finished" podID="1565c2f4-e8d2-48ac-944e-f143856732db" containerID="09f47e45b0173e0cf0604420dde417e0878620fe75c2fe30d52fc3c8ca000a89" exitCode=143
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.478317 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.481571 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b9571fd-7aa2-4e30-81f9-465a9c4291c8" path="/var/lib/kubelet/pods/0b9571fd-7aa2-4e30-81f9-465a9c4291c8/volumes"
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.482287 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4fa15d8-bacc-4ce0-bd25-41e451404ab3" path="/var/lib/kubelet/pods/e4fa15d8-bacc-4ce0-bd25-41e451404ab3/volumes"
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.495761 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6d4594595c-zvnfb"
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.495819 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1565c2f4-e8d2-48ac-944e-f143856732db","Type":"ContainerDied","Data":"03ff2cb1b200693813e0239e3bb0d9f24e7713d139dccd116b4f714edf9c92e8"}
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.495858 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1565c2f4-e8d2-48ac-944e-f143856732db","Type":"ContainerDied","Data":"09f47e45b0173e0cf0604420dde417e0878620fe75c2fe30d52fc3c8ca000a89"}
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.495877 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1565c2f4-e8d2-48ac-944e-f143856732db","Type":"ContainerDied","Data":"fbde812e694798db1a937891f10baa11b7c399f82dfa176bb8fc15f9ff3dcbc7"}
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.495891 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d4594595c-zvnfb" event={"ID":"dffbc8b7-8080-4958-915d-ee66f5ae732b","Type":"ContainerStarted","Data":"ed0e95f05ddcef44af0f5d608a2b55b0f5fc171c3f3229dc31b4a2e6aa394571"}
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.495919 4869 scope.go:117] "RemoveContainer" containerID="03ff2cb1b200693813e0239e3bb0d9f24e7713d139dccd116b4f714edf9c92e8"
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.497930 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91512d0d-84f7-41c0-aca8-cbf9d2839927","Type":"ContainerDied","Data":"4224db997a002b231eaa2717c1b6efdec1847ee9262636ac76560dd4d286df53"}
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.498198 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.521384 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6d4594595c-zvnfb" podStartSLOduration=4.521362841 podStartE2EDuration="4.521362841s" podCreationTimestamp="2026-02-18 06:06:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:29.51597281 +0000 UTC m=+1086.685061042" watchObservedRunningTime="2026-02-18 06:06:29.521362841 +0000 UTC m=+1086.690451073"
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.542126 4869 scope.go:117] "RemoveContainer" containerID="09f47e45b0173e0cf0604420dde417e0878620fe75c2fe30d52fc3c8ca000a89"
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.589009 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1565c2f4-e8d2-48ac-944e-f143856732db-config-data\") pod \"1565c2f4-e8d2-48ac-944e-f143856732db\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") "
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.589079 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1565c2f4-e8d2-48ac-944e-f143856732db-combined-ca-bundle\") pod \"1565c2f4-e8d2-48ac-944e-f143856732db\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") "
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.589157 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp42n\" (UniqueName: \"kubernetes.io/projected/1565c2f4-e8d2-48ac-944e-f143856732db-kube-api-access-fp42n\") pod \"1565c2f4-e8d2-48ac-944e-f143856732db\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") "
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.589223 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1565c2f4-e8d2-48ac-944e-f143856732db-config-data-custom\") pod \"1565c2f4-e8d2-48ac-944e-f143856732db\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") "
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.589260 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1565c2f4-e8d2-48ac-944e-f143856732db-etc-machine-id\") pod \"1565c2f4-e8d2-48ac-944e-f143856732db\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") "
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.589302 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1565c2f4-e8d2-48ac-944e-f143856732db-scripts\") pod \"1565c2f4-e8d2-48ac-944e-f143856732db\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") "
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.589332 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1565c2f4-e8d2-48ac-944e-f143856732db-logs\") pod \"1565c2f4-e8d2-48ac-944e-f143856732db\" (UID: \"1565c2f4-e8d2-48ac-944e-f143856732db\") "
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.592114 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.595252 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1565c2f4-e8d2-48ac-944e-f143856732db-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1565c2f4-e8d2-48ac-944e-f143856732db" (UID: "1565c2f4-e8d2-48ac-944e-f143856732db"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.597701 4869 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1565c2f4-e8d2-48ac-944e-f143856732db-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.606295 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1565c2f4-e8d2-48ac-944e-f143856732db-logs" (OuterVolumeSpecName: "logs") pod "1565c2f4-e8d2-48ac-944e-f143856732db" (UID: "1565c2f4-e8d2-48ac-944e-f143856732db"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.610262 4869 scope.go:117] "RemoveContainer" containerID="03ff2cb1b200693813e0239e3bb0d9f24e7713d139dccd116b4f714edf9c92e8"
Feb 18 06:06:29 crc kubenswrapper[4869]: E0218 06:06:29.614214 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03ff2cb1b200693813e0239e3bb0d9f24e7713d139dccd116b4f714edf9c92e8\": container with ID starting with 03ff2cb1b200693813e0239e3bb0d9f24e7713d139dccd116b4f714edf9c92e8 not found: ID does not exist" containerID="03ff2cb1b200693813e0239e3bb0d9f24e7713d139dccd116b4f714edf9c92e8"
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.614266 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03ff2cb1b200693813e0239e3bb0d9f24e7713d139dccd116b4f714edf9c92e8"} err="failed to get container status \"03ff2cb1b200693813e0239e3bb0d9f24e7713d139dccd116b4f714edf9c92e8\": rpc error: code = NotFound desc = could not find container \"03ff2cb1b200693813e0239e3bb0d9f24e7713d139dccd116b4f714edf9c92e8\": container with ID starting with 03ff2cb1b200693813e0239e3bb0d9f24e7713d139dccd116b4f714edf9c92e8 not found: ID does not exist"
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.614298 4869 scope.go:117] "RemoveContainer" containerID="09f47e45b0173e0cf0604420dde417e0878620fe75c2fe30d52fc3c8ca000a89"
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.615982 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1565c2f4-e8d2-48ac-944e-f143856732db-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1565c2f4-e8d2-48ac-944e-f143856732db" (UID: "1565c2f4-e8d2-48ac-944e-f143856732db"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.616012 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1565c2f4-e8d2-48ac-944e-f143856732db-kube-api-access-fp42n" (OuterVolumeSpecName: "kube-api-access-fp42n") pod "1565c2f4-e8d2-48ac-944e-f143856732db" (UID: "1565c2f4-e8d2-48ac-944e-f143856732db"). InnerVolumeSpecName "kube-api-access-fp42n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:06:29 crc kubenswrapper[4869]: E0218 06:06:29.616452 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09f47e45b0173e0cf0604420dde417e0878620fe75c2fe30d52fc3c8ca000a89\": container with ID starting with 09f47e45b0173e0cf0604420dde417e0878620fe75c2fe30d52fc3c8ca000a89 not found: ID does not exist" containerID="09f47e45b0173e0cf0604420dde417e0878620fe75c2fe30d52fc3c8ca000a89"
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.616512 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09f47e45b0173e0cf0604420dde417e0878620fe75c2fe30d52fc3c8ca000a89"} err="failed to get container status \"09f47e45b0173e0cf0604420dde417e0878620fe75c2fe30d52fc3c8ca000a89\": rpc error: code = NotFound desc = could not find container \"09f47e45b0173e0cf0604420dde417e0878620fe75c2fe30d52fc3c8ca000a89\": container with ID starting with 09f47e45b0173e0cf0604420dde417e0878620fe75c2fe30d52fc3c8ca000a89 not found: ID does not exist"
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.616550 4869 scope.go:117] "RemoveContainer" containerID="03ff2cb1b200693813e0239e3bb0d9f24e7713d139dccd116b4f714edf9c92e8"
Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.617039 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03ff2cb1b200693813e0239e3bb0d9f24e7713d139dccd116b4f714edf9c92e8"} err="failed to 
get container status \"03ff2cb1b200693813e0239e3bb0d9f24e7713d139dccd116b4f714edf9c92e8\": rpc error: code = NotFound desc = could not find container \"03ff2cb1b200693813e0239e3bb0d9f24e7713d139dccd116b4f714edf9c92e8\": container with ID starting with 03ff2cb1b200693813e0239e3bb0d9f24e7713d139dccd116b4f714edf9c92e8 not found: ID does not exist" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.617055 4869 scope.go:117] "RemoveContainer" containerID="09f47e45b0173e0cf0604420dde417e0878620fe75c2fe30d52fc3c8ca000a89" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.617230 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09f47e45b0173e0cf0604420dde417e0878620fe75c2fe30d52fc3c8ca000a89"} err="failed to get container status \"09f47e45b0173e0cf0604420dde417e0878620fe75c2fe30d52fc3c8ca000a89\": rpc error: code = NotFound desc = could not find container \"09f47e45b0173e0cf0604420dde417e0878620fe75c2fe30d52fc3c8ca000a89\": container with ID starting with 09f47e45b0173e0cf0604420dde417e0878620fe75c2fe30d52fc3c8ca000a89 not found: ID does not exist" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.617245 4869 scope.go:117] "RemoveContainer" containerID="dbc8dd352de1a387568d2a7a19a86c3d75ca0aa8d8e171dd7dd198f80585dca9" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.617854 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.618135 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1565c2f4-e8d2-48ac-944e-f143856732db-scripts" (OuterVolumeSpecName: "scripts") pod "1565c2f4-e8d2-48ac-944e-f143856732db" (UID: "1565c2f4-e8d2-48ac-944e-f143856732db"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.626873 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:06:29 crc kubenswrapper[4869]: E0218 06:06:29.627329 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1565c2f4-e8d2-48ac-944e-f143856732db" containerName="cinder-api-log" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.627344 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="1565c2f4-e8d2-48ac-944e-f143856732db" containerName="cinder-api-log" Feb 18 06:06:29 crc kubenswrapper[4869]: E0218 06:06:29.627364 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91512d0d-84f7-41c0-aca8-cbf9d2839927" containerName="ceilometer-central-agent" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.627370 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="91512d0d-84f7-41c0-aca8-cbf9d2839927" containerName="ceilometer-central-agent" Feb 18 06:06:29 crc kubenswrapper[4869]: E0218 06:06:29.627384 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4fa15d8-bacc-4ce0-bd25-41e451404ab3" containerName="horizon-log" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.627390 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4fa15d8-bacc-4ce0-bd25-41e451404ab3" containerName="horizon-log" Feb 18 06:06:29 crc kubenswrapper[4869]: E0218 06:06:29.627396 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91512d0d-84f7-41c0-aca8-cbf9d2839927" containerName="proxy-httpd" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.627401 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="91512d0d-84f7-41c0-aca8-cbf9d2839927" containerName="proxy-httpd" Feb 18 06:06:29 crc kubenswrapper[4869]: E0218 06:06:29.627412 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1565c2f4-e8d2-48ac-944e-f143856732db" containerName="cinder-api" Feb 18 
06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.627417 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="1565c2f4-e8d2-48ac-944e-f143856732db" containerName="cinder-api" Feb 18 06:06:29 crc kubenswrapper[4869]: E0218 06:06:29.627431 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9571fd-7aa2-4e30-81f9-465a9c4291c8" containerName="horizon" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.627437 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9571fd-7aa2-4e30-81f9-465a9c4291c8" containerName="horizon" Feb 18 06:06:29 crc kubenswrapper[4869]: E0218 06:06:29.627445 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9571fd-7aa2-4e30-81f9-465a9c4291c8" containerName="horizon-log" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.627451 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9571fd-7aa2-4e30-81f9-465a9c4291c8" containerName="horizon-log" Feb 18 06:06:29 crc kubenswrapper[4869]: E0218 06:06:29.627463 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91512d0d-84f7-41c0-aca8-cbf9d2839927" containerName="sg-core" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.627468 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="91512d0d-84f7-41c0-aca8-cbf9d2839927" containerName="sg-core" Feb 18 06:06:29 crc kubenswrapper[4869]: E0218 06:06:29.627486 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4fa15d8-bacc-4ce0-bd25-41e451404ab3" containerName="horizon" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.627491 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4fa15d8-bacc-4ce0-bd25-41e451404ab3" containerName="horizon" Feb 18 06:06:29 crc kubenswrapper[4869]: E0218 06:06:29.627510 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91512d0d-84f7-41c0-aca8-cbf9d2839927" containerName="ceilometer-notification-agent" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.627515 4869 
state_mem.go:107] "Deleted CPUSet assignment" podUID="91512d0d-84f7-41c0-aca8-cbf9d2839927" containerName="ceilometer-notification-agent" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.627775 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="91512d0d-84f7-41c0-aca8-cbf9d2839927" containerName="sg-core" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.627786 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b9571fd-7aa2-4e30-81f9-465a9c4291c8" containerName="horizon-log" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.627796 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="1565c2f4-e8d2-48ac-944e-f143856732db" containerName="cinder-api" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.627807 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="91512d0d-84f7-41c0-aca8-cbf9d2839927" containerName="proxy-httpd" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.627818 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="91512d0d-84f7-41c0-aca8-cbf9d2839927" containerName="ceilometer-notification-agent" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.627829 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="1565c2f4-e8d2-48ac-944e-f143856732db" containerName="cinder-api-log" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.627840 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b9571fd-7aa2-4e30-81f9-465a9c4291c8" containerName="horizon" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.627852 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="91512d0d-84f7-41c0-aca8-cbf9d2839927" containerName="ceilometer-central-agent" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.627861 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4fa15d8-bacc-4ce0-bd25-41e451404ab3" containerName="horizon" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 
06:06:29.627871 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4fa15d8-bacc-4ce0-bd25-41e451404ab3" containerName="horizon-log" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.629771 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.630288 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1565c2f4-e8d2-48ac-944e-f143856732db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1565c2f4-e8d2-48ac-944e-f143856732db" (UID: "1565c2f4-e8d2-48ac-944e-f143856732db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.634228 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.634425 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.644607 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.692717 4869 scope.go:117] "RemoveContainer" containerID="cb76054c6f93b947662ea5d20ca8b46ce0707c8f6eb72b79dfdf979cd0c1f034" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.698731 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tnhc\" (UniqueName: \"kubernetes.io/projected/207fd9cd-d5eb-4821-974c-5744bbbd981b-kube-api-access-9tnhc\") pod \"ceilometer-0\" (UID: \"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " pod="openstack/ceilometer-0" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.698819 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/207fd9cd-d5eb-4821-974c-5744bbbd981b-run-httpd\") pod \"ceilometer-0\" (UID: \"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " pod="openstack/ceilometer-0" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.698839 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/207fd9cd-d5eb-4821-974c-5744bbbd981b-log-httpd\") pod \"ceilometer-0\" (UID: \"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " pod="openstack/ceilometer-0" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.698858 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/207fd9cd-d5eb-4821-974c-5744bbbd981b-config-data\") pod \"ceilometer-0\" (UID: \"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " pod="openstack/ceilometer-0" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.698914 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/207fd9cd-d5eb-4821-974c-5744bbbd981b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " pod="openstack/ceilometer-0" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.698930 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/207fd9cd-d5eb-4821-974c-5744bbbd981b-scripts\") pod \"ceilometer-0\" (UID: \"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " pod="openstack/ceilometer-0" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.698944 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/207fd9cd-d5eb-4821-974c-5744bbbd981b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " pod="openstack/ceilometer-0" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.699024 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1565c2f4-e8d2-48ac-944e-f143856732db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.699036 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp42n\" (UniqueName: \"kubernetes.io/projected/1565c2f4-e8d2-48ac-944e-f143856732db-kube-api-access-fp42n\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.699047 4869 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1565c2f4-e8d2-48ac-944e-f143856732db-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.699055 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1565c2f4-e8d2-48ac-944e-f143856732db-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.699063 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1565c2f4-e8d2-48ac-944e-f143856732db-logs\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.718918 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1565c2f4-e8d2-48ac-944e-f143856732db-config-data" (OuterVolumeSpecName: "config-data") pod "1565c2f4-e8d2-48ac-944e-f143856732db" (UID: "1565c2f4-e8d2-48ac-944e-f143856732db"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.802452 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/207fd9cd-d5eb-4821-974c-5744bbbd981b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " pod="openstack/ceilometer-0" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.802506 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/207fd9cd-d5eb-4821-974c-5744bbbd981b-scripts\") pod \"ceilometer-0\" (UID: \"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " pod="openstack/ceilometer-0" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.802538 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/207fd9cd-d5eb-4821-974c-5744bbbd981b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " pod="openstack/ceilometer-0" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.802669 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tnhc\" (UniqueName: \"kubernetes.io/projected/207fd9cd-d5eb-4821-974c-5744bbbd981b-kube-api-access-9tnhc\") pod \"ceilometer-0\" (UID: \"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " pod="openstack/ceilometer-0" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.802727 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/207fd9cd-d5eb-4821-974c-5744bbbd981b-run-httpd\") pod \"ceilometer-0\" (UID: \"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " pod="openstack/ceilometer-0" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.802778 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/207fd9cd-d5eb-4821-974c-5744bbbd981b-log-httpd\") pod \"ceilometer-0\" (UID: \"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " pod="openstack/ceilometer-0" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.802805 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/207fd9cd-d5eb-4821-974c-5744bbbd981b-config-data\") pod \"ceilometer-0\" (UID: \"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " pod="openstack/ceilometer-0" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.802890 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1565c2f4-e8d2-48ac-944e-f143856732db-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.807275 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/207fd9cd-d5eb-4821-974c-5744bbbd981b-run-httpd\") pod \"ceilometer-0\" (UID: \"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " pod="openstack/ceilometer-0" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.811995 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/207fd9cd-d5eb-4821-974c-5744bbbd981b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " pod="openstack/ceilometer-0" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.812838 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/207fd9cd-d5eb-4821-974c-5744bbbd981b-config-data\") pod \"ceilometer-0\" (UID: \"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " pod="openstack/ceilometer-0" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.825640 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/207fd9cd-d5eb-4821-974c-5744bbbd981b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " pod="openstack/ceilometer-0" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.830407 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tnhc\" (UniqueName: \"kubernetes.io/projected/207fd9cd-d5eb-4821-974c-5744bbbd981b-kube-api-access-9tnhc\") pod \"ceilometer-0\" (UID: \"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " pod="openstack/ceilometer-0" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.830669 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/207fd9cd-d5eb-4821-974c-5744bbbd981b-scripts\") pod \"ceilometer-0\" (UID: \"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " pod="openstack/ceilometer-0" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.835069 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/207fd9cd-d5eb-4821-974c-5744bbbd981b-log-httpd\") pod \"ceilometer-0\" (UID: \"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " pod="openstack/ceilometer-0" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.924403 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.948406 4869 scope.go:117] "RemoveContainer" containerID="ddd99ad38a349df8ed1aa180885bb67905bce7cc10ac1040ed3d5c8c37bc13b5" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.952125 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.989462 4869 scope.go:117] "RemoveContainer" containerID="f09f7ddec431acb4437d2f6a245a598f31fbdd07457ae52f0d2bddb63ae30624" Feb 18 06:06:29 crc kubenswrapper[4869]: I0218 06:06:29.997845 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.009812 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.040771 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.042154 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.044140 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.044307 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.044388 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.111703 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e110f65d-fa29-4024-a0d8-352543bd0c1b-config-data\") pod \"cinder-api-0\" (UID: 
\"e110f65d-fa29-4024-a0d8-352543bd0c1b\") " pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.114259 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e110f65d-fa29-4024-a0d8-352543bd0c1b-config-data-custom\") pod \"cinder-api-0\" (UID: \"e110f65d-fa29-4024-a0d8-352543bd0c1b\") " pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.114341 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e110f65d-fa29-4024-a0d8-352543bd0c1b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e110f65d-fa29-4024-a0d8-352543bd0c1b\") " pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.114372 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e110f65d-fa29-4024-a0d8-352543bd0c1b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e110f65d-fa29-4024-a0d8-352543bd0c1b\") " pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.114472 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e110f65d-fa29-4024-a0d8-352543bd0c1b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e110f65d-fa29-4024-a0d8-352543bd0c1b\") " pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.114501 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e110f65d-fa29-4024-a0d8-352543bd0c1b-scripts\") pod \"cinder-api-0\" (UID: \"e110f65d-fa29-4024-a0d8-352543bd0c1b\") " pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 
06:06:30.114516 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e110f65d-fa29-4024-a0d8-352543bd0c1b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e110f65d-fa29-4024-a0d8-352543bd0c1b\") " pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.114550 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e110f65d-fa29-4024-a0d8-352543bd0c1b-logs\") pod \"cinder-api-0\" (UID: \"e110f65d-fa29-4024-a0d8-352543bd0c1b\") " pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.114600 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxqkb\" (UniqueName: \"kubernetes.io/projected/e110f65d-fa29-4024-a0d8-352543bd0c1b-kube-api-access-qxqkb\") pod \"cinder-api-0\" (UID: \"e110f65d-fa29-4024-a0d8-352543bd0c1b\") " pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.218395 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e110f65d-fa29-4024-a0d8-352543bd0c1b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e110f65d-fa29-4024-a0d8-352543bd0c1b\") " pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.218449 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e110f65d-fa29-4024-a0d8-352543bd0c1b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e110f65d-fa29-4024-a0d8-352543bd0c1b\") " pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.218507 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/e110f65d-fa29-4024-a0d8-352543bd0c1b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e110f65d-fa29-4024-a0d8-352543bd0c1b\") " pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.218527 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e110f65d-fa29-4024-a0d8-352543bd0c1b-scripts\") pod \"cinder-api-0\" (UID: \"e110f65d-fa29-4024-a0d8-352543bd0c1b\") " pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.218542 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e110f65d-fa29-4024-a0d8-352543bd0c1b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e110f65d-fa29-4024-a0d8-352543bd0c1b\") " pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.218564 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e110f65d-fa29-4024-a0d8-352543bd0c1b-logs\") pod \"cinder-api-0\" (UID: \"e110f65d-fa29-4024-a0d8-352543bd0c1b\") " pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.218593 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxqkb\" (UniqueName: \"kubernetes.io/projected/e110f65d-fa29-4024-a0d8-352543bd0c1b-kube-api-access-qxqkb\") pod \"cinder-api-0\" (UID: \"e110f65d-fa29-4024-a0d8-352543bd0c1b\") " pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.218654 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e110f65d-fa29-4024-a0d8-352543bd0c1b-config-data\") pod \"cinder-api-0\" (UID: \"e110f65d-fa29-4024-a0d8-352543bd0c1b\") " pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 
06:06:30.218692 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e110f65d-fa29-4024-a0d8-352543bd0c1b-config-data-custom\") pod \"cinder-api-0\" (UID: \"e110f65d-fa29-4024-a0d8-352543bd0c1b\") " pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.219979 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e110f65d-fa29-4024-a0d8-352543bd0c1b-logs\") pod \"cinder-api-0\" (UID: \"e110f65d-fa29-4024-a0d8-352543bd0c1b\") " pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.223327 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-57f5fddd88-qhh5n" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.224379 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e110f65d-fa29-4024-a0d8-352543bd0c1b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e110f65d-fa29-4024-a0d8-352543bd0c1b\") " pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.228229 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e110f65d-fa29-4024-a0d8-352543bd0c1b-config-data-custom\") pod \"cinder-api-0\" (UID: \"e110f65d-fa29-4024-a0d8-352543bd0c1b\") " pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.229037 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e110f65d-fa29-4024-a0d8-352543bd0c1b-config-data\") pod \"cinder-api-0\" (UID: \"e110f65d-fa29-4024-a0d8-352543bd0c1b\") " pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.232695 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e110f65d-fa29-4024-a0d8-352543bd0c1b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e110f65d-fa29-4024-a0d8-352543bd0c1b\") " pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.233285 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e110f65d-fa29-4024-a0d8-352543bd0c1b-scripts\") pod \"cinder-api-0\" (UID: \"e110f65d-fa29-4024-a0d8-352543bd0c1b\") " pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.234432 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e110f65d-fa29-4024-a0d8-352543bd0c1b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e110f65d-fa29-4024-a0d8-352543bd0c1b\") " pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.238899 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e110f65d-fa29-4024-a0d8-352543bd0c1b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e110f65d-fa29-4024-a0d8-352543bd0c1b\") " pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.240568 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxqkb\" (UniqueName: \"kubernetes.io/projected/e110f65d-fa29-4024-a0d8-352543bd0c1b-kube-api-access-qxqkb\") pod \"cinder-api-0\" (UID: \"e110f65d-fa29-4024-a0d8-352543bd0c1b\") " pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.304556 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.308680 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-69d999cf4d-drf2r"] Feb 18 06:06:30 crc 
kubenswrapper[4869]: I0218 06:06:30.420485 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.482688 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.513022 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-69d999cf4d-drf2r" podUID="adfe77ee-719d-4b80-ae06-8a0a370cf7d2" containerName="horizon" containerID="cri-o://6db09d6ae8339e476ae31de77876ac066531920f74a0a3430c02d01110a5150c" gracePeriod=30 Feb 18 06:06:30 crc kubenswrapper[4869]: I0218 06:06:30.513026 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-69d999cf4d-drf2r" podUID="adfe77ee-719d-4b80-ae06-8a0a370cf7d2" containerName="horizon-log" containerID="cri-o://fbe2d81ec4b4d1900d8d3f98fb8f0834cb3dfa672b3fcf5bb915d1b5dc3fc07b" gracePeriod=30 Feb 18 06:06:30 crc kubenswrapper[4869]: W0218 06:06:30.935722 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod207fd9cd_d5eb_4821_974c_5744bbbd981b.slice/crio-4d3fa2b8e806fcf13af1d2148077782b76f1d5f0ba0470ef686edd826cfef5dc WatchSource:0}: Error finding container 4d3fa2b8e806fcf13af1d2148077782b76f1d5f0ba0470ef686edd826cfef5dc: Status 404 returned error can't find the container with id 4d3fa2b8e806fcf13af1d2148077782b76f1d5f0ba0470ef686edd826cfef5dc Feb 18 06:06:31 crc kubenswrapper[4869]: I0218 06:06:31.466496 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 18 06:06:31 crc kubenswrapper[4869]: W0218 06:06:31.469109 4869 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode110f65d_fa29_4024_a0d8_352543bd0c1b.slice/crio-3844530dd8a43c350c3c680f00c8e9147585a0c30d754cab8743be5de104cf58 WatchSource:0}: Error finding container 3844530dd8a43c350c3c680f00c8e9147585a0c30d754cab8743be5de104cf58: Status 404 returned error can't find the container with id 3844530dd8a43c350c3c680f00c8e9147585a0c30d754cab8743be5de104cf58 Feb 18 06:06:31 crc kubenswrapper[4869]: I0218 06:06:31.480363 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1565c2f4-e8d2-48ac-944e-f143856732db" path="/var/lib/kubelet/pods/1565c2f4-e8d2-48ac-944e-f143856732db/volumes" Feb 18 06:06:31 crc kubenswrapper[4869]: I0218 06:06:31.481394 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91512d0d-84f7-41c0-aca8-cbf9d2839927" path="/var/lib/kubelet/pods/91512d0d-84f7-41c0-aca8-cbf9d2839927/volumes" Feb 18 06:06:31 crc kubenswrapper[4869]: I0218 06:06:31.526481 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e110f65d-fa29-4024-a0d8-352543bd0c1b","Type":"ContainerStarted","Data":"3844530dd8a43c350c3c680f00c8e9147585a0c30d754cab8743be5de104cf58"} Feb 18 06:06:31 crc kubenswrapper[4869]: I0218 06:06:31.533454 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"207fd9cd-d5eb-4821-974c-5744bbbd981b","Type":"ContainerStarted","Data":"4d3fa2b8e806fcf13af1d2148077782b76f1d5f0ba0470ef686edd826cfef5dc"} Feb 18 06:06:31 crc kubenswrapper[4869]: I0218 06:06:31.868910 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-85cc9b8698-mzcgf" Feb 18 06:06:31 crc kubenswrapper[4869]: I0218 06:06:31.918618 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.033526 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/barbican-api-85cc9b8698-mzcgf" Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.138952 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-56cbc85666-kbbjf"] Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.139618 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-56cbc85666-kbbjf" podUID="2d50f595-31ed-4c89-ad13-63ae638b83c0" containerName="barbican-api-log" containerID="cri-o://8241cfb2e3663f67e7354d6c6c51b02d9b2c80557c89e590e2bfb85ce1063b0d" gracePeriod=30 Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.140110 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-56cbc85666-kbbjf" podUID="2d50f595-31ed-4c89-ad13-63ae638b83c0" containerName="barbican-api" containerID="cri-o://a4a68942bd2658b39f257bcc42ff5f9e3acb414817fdf61022d741486db2e479" gracePeriod=30 Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.145994 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56cbc85666-kbbjf" podUID="2d50f595-31ed-4c89-ad13-63ae638b83c0" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": EOF" Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.146138 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56cbc85666-kbbjf" podUID="2d50f595-31ed-4c89-ad13-63ae638b83c0" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": EOF" Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.440079 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5466c5cc6f-xd5xf" Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.486627 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-combined-ca-bundle\") pod \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.486710 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpcv7\" (UniqueName: \"kubernetes.io/projected/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-kube-api-access-kpcv7\") pod \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.486736 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-public-tls-certs\") pod \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.486782 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-internal-tls-certs\") pod \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.486814 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-config\") pod \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.486867 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-ovndb-tls-certs\") pod \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.487118 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-httpd-config\") pod \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.492905 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "2f13a3e9-97c1-4eaa-a0fb-f449a201a542" (UID: "2f13a3e9-97c1-4eaa-a0fb-f449a201a542"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.510917 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-kube-api-access-kpcv7" (OuterVolumeSpecName: "kube-api-access-kpcv7") pod "2f13a3e9-97c1-4eaa-a0fb-f449a201a542" (UID: "2f13a3e9-97c1-4eaa-a0fb-f449a201a542"). InnerVolumeSpecName "kube-api-access-kpcv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.590954 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2f13a3e9-97c1-4eaa-a0fb-f449a201a542" (UID: "2f13a3e9-97c1-4eaa-a0fb-f449a201a542"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.591648 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-public-tls-certs\") pod \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\" (UID: \"2f13a3e9-97c1-4eaa-a0fb-f449a201a542\") " Feb 18 06:06:32 crc kubenswrapper[4869]: W0218 06:06:32.611213 4869 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/2f13a3e9-97c1-4eaa-a0fb-f449a201a542/volumes/kubernetes.io~secret/public-tls-certs Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.611250 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2f13a3e9-97c1-4eaa-a0fb-f449a201a542" (UID: "2f13a3e9-97c1-4eaa-a0fb-f449a201a542"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.613179 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpcv7\" (UniqueName: \"kubernetes.io/projected/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-kube-api-access-kpcv7\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.613204 4869 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.613223 4869 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.614613 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"207fd9cd-d5eb-4821-974c-5744bbbd981b","Type":"ContainerStarted","Data":"260f02e0990e8df6d6e821a2fdf4a3d71bc60774b01bf203f11c6671c447026b"} Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.614656 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"207fd9cd-d5eb-4821-974c-5744bbbd981b","Type":"ContainerStarted","Data":"75379434ada0a52f949c485cf893c5f7790e9079bd586a92916294a5ffff6873"} Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.621913 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e110f65d-fa29-4024-a0d8-352543bd0c1b","Type":"ContainerStarted","Data":"1a38b285d601bbedfafb031df62fe9f48d3cbe3d228b677b3cc857fe7bda5e66"} Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.628661 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56cbc85666-kbbjf" 
event={"ID":"2d50f595-31ed-4c89-ad13-63ae638b83c0","Type":"ContainerDied","Data":"8241cfb2e3663f67e7354d6c6c51b02d9b2c80557c89e590e2bfb85ce1063b0d"} Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.631951 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-config" (OuterVolumeSpecName: "config") pod "2f13a3e9-97c1-4eaa-a0fb-f449a201a542" (UID: "2f13a3e9-97c1-4eaa-a0fb-f449a201a542"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.655818 4869 generic.go:334] "Generic (PLEG): container finished" podID="2d50f595-31ed-4c89-ad13-63ae638b83c0" containerID="8241cfb2e3663f67e7354d6c6c51b02d9b2c80557c89e590e2bfb85ce1063b0d" exitCode=143 Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.680700 4869 generic.go:334] "Generic (PLEG): container finished" podID="2f13a3e9-97c1-4eaa-a0fb-f449a201a542" containerID="fe797e117876091705bdd0135f414262e2fd9f9a35401ac398945c6ef78e7287" exitCode=0 Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.683012 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5466c5cc6f-xd5xf" event={"ID":"2f13a3e9-97c1-4eaa-a0fb-f449a201a542","Type":"ContainerDied","Data":"fe797e117876091705bdd0135f414262e2fd9f9a35401ac398945c6ef78e7287"} Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.717374 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5466c5cc6f-xd5xf" event={"ID":"2f13a3e9-97c1-4eaa-a0fb-f449a201a542","Type":"ContainerDied","Data":"54f75c82a3746f651d157cf929936ad5c90114789f51a5d0745301c521c51428"} Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.714735 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 
06:06:32.683234 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5466c5cc6f-xd5xf" Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.717882 4869 scope.go:117] "RemoveContainer" containerID="00595b0d60cbad75fbb436effe6268e3f8744bf8bf10b2a142cfd6cc2a285089" Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.719481 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f13a3e9-97c1-4eaa-a0fb-f449a201a542" (UID: "2f13a3e9-97c1-4eaa-a0fb-f449a201a542"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.764601 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "2f13a3e9-97c1-4eaa-a0fb-f449a201a542" (UID: "2f13a3e9-97c1-4eaa-a0fb-f449a201a542"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.779985 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2f13a3e9-97c1-4eaa-a0fb-f449a201a542" (UID: "2f13a3e9-97c1-4eaa-a0fb-f449a201a542"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.819467 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.819505 4869 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.819514 4869 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f13a3e9-97c1-4eaa-a0fb-f449a201a542-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.900344 4869 scope.go:117] "RemoveContainer" containerID="fe797e117876091705bdd0135f414262e2fd9f9a35401ac398945c6ef78e7287" Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.943496 4869 scope.go:117] "RemoveContainer" containerID="00595b0d60cbad75fbb436effe6268e3f8744bf8bf10b2a142cfd6cc2a285089" Feb 18 06:06:32 crc kubenswrapper[4869]: E0218 06:06:32.944225 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00595b0d60cbad75fbb436effe6268e3f8744bf8bf10b2a142cfd6cc2a285089\": container with ID starting with 00595b0d60cbad75fbb436effe6268e3f8744bf8bf10b2a142cfd6cc2a285089 not found: ID does not exist" containerID="00595b0d60cbad75fbb436effe6268e3f8744bf8bf10b2a142cfd6cc2a285089" Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.944290 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00595b0d60cbad75fbb436effe6268e3f8744bf8bf10b2a142cfd6cc2a285089"} err="failed to get container status 
\"00595b0d60cbad75fbb436effe6268e3f8744bf8bf10b2a142cfd6cc2a285089\": rpc error: code = NotFound desc = could not find container \"00595b0d60cbad75fbb436effe6268e3f8744bf8bf10b2a142cfd6cc2a285089\": container with ID starting with 00595b0d60cbad75fbb436effe6268e3f8744bf8bf10b2a142cfd6cc2a285089 not found: ID does not exist" Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.944333 4869 scope.go:117] "RemoveContainer" containerID="fe797e117876091705bdd0135f414262e2fd9f9a35401ac398945c6ef78e7287" Feb 18 06:06:32 crc kubenswrapper[4869]: E0218 06:06:32.944822 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe797e117876091705bdd0135f414262e2fd9f9a35401ac398945c6ef78e7287\": container with ID starting with fe797e117876091705bdd0135f414262e2fd9f9a35401ac398945c6ef78e7287 not found: ID does not exist" containerID="fe797e117876091705bdd0135f414262e2fd9f9a35401ac398945c6ef78e7287" Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.944854 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe797e117876091705bdd0135f414262e2fd9f9a35401ac398945c6ef78e7287"} err="failed to get container status \"fe797e117876091705bdd0135f414262e2fd9f9a35401ac398945c6ef78e7287\": rpc error: code = NotFound desc = could not find container \"fe797e117876091705bdd0135f414262e2fd9f9a35401ac398945c6ef78e7287\": container with ID starting with fe797e117876091705bdd0135f414262e2fd9f9a35401ac398945c6ef78e7287 not found: ID does not exist" Feb 18 06:06:32 crc kubenswrapper[4869]: I0218 06:06:32.962200 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 18 06:06:33 crc kubenswrapper[4869]: I0218 06:06:33.066783 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5466c5cc6f-xd5xf"] Feb 18 06:06:33 crc kubenswrapper[4869]: I0218 06:06:33.073787 4869 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/neutron-5466c5cc6f-xd5xf"] Feb 18 06:06:33 crc kubenswrapper[4869]: I0218 06:06:33.215208 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 18 06:06:33 crc kubenswrapper[4869]: I0218 06:06:33.289986 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" Feb 18 06:06:33 crc kubenswrapper[4869]: I0218 06:06:33.357331 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-7vv8w"] Feb 18 06:06:33 crc kubenswrapper[4869]: I0218 06:06:33.357621 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" podUID="1467815e-0912-4dc2-b87d-4cab891b93b2" containerName="dnsmasq-dns" containerID="cri-o://36fdbafff93225d72b82cafe24fdc9a0997a1ed8ebf247394f3ce2dfb7a47b77" gracePeriod=10 Feb 18 06:06:33 crc kubenswrapper[4869]: I0218 06:06:33.493014 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f13a3e9-97c1-4eaa-a0fb-f449a201a542" path="/var/lib/kubelet/pods/2f13a3e9-97c1-4eaa-a0fb-f449a201a542/volumes" Feb 18 06:06:33 crc kubenswrapper[4869]: I0218 06:06:33.746691 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e110f65d-fa29-4024-a0d8-352543bd0c1b","Type":"ContainerStarted","Data":"6c1cbfc6dc60e0ff9647ce556fe90101eed8f729ec697c86edc84604549a55b8"} Feb 18 06:06:33 crc kubenswrapper[4869]: I0218 06:06:33.747604 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 18 06:06:33 crc kubenswrapper[4869]: I0218 06:06:33.786546 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"207fd9cd-d5eb-4821-974c-5744bbbd981b","Type":"ContainerStarted","Data":"788fa423653f9600123381245e5afb69ced12ef077dfd78321fbd98fb8a2c1cb"} Feb 18 06:06:33 crc kubenswrapper[4869]: I0218 
06:06:33.794163 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.794132288 podStartE2EDuration="4.794132288s" podCreationTimestamp="2026-02-18 06:06:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:33.771076239 +0000 UTC m=+1090.940164471" watchObservedRunningTime="2026-02-18 06:06:33.794132288 +0000 UTC m=+1090.963220520" Feb 18 06:06:33 crc kubenswrapper[4869]: I0218 06:06:33.795623 4869 generic.go:334] "Generic (PLEG): container finished" podID="1467815e-0912-4dc2-b87d-4cab891b93b2" containerID="36fdbafff93225d72b82cafe24fdc9a0997a1ed8ebf247394f3ce2dfb7a47b77" exitCode=0 Feb 18 06:06:33 crc kubenswrapper[4869]: I0218 06:06:33.795705 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" event={"ID":"1467815e-0912-4dc2-b87d-4cab891b93b2","Type":"ContainerDied","Data":"36fdbafff93225d72b82cafe24fdc9a0997a1ed8ebf247394f3ce2dfb7a47b77"} Feb 18 06:06:33 crc kubenswrapper[4869]: I0218 06:06:33.911535 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.051587 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.168452 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-ovsdbserver-sb\") pod \"1467815e-0912-4dc2-b87d-4cab891b93b2\" (UID: \"1467815e-0912-4dc2-b87d-4cab891b93b2\") " Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.170301 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-ovsdbserver-nb\") pod \"1467815e-0912-4dc2-b87d-4cab891b93b2\" (UID: \"1467815e-0912-4dc2-b87d-4cab891b93b2\") " Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.170481 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-config\") pod \"1467815e-0912-4dc2-b87d-4cab891b93b2\" (UID: \"1467815e-0912-4dc2-b87d-4cab891b93b2\") " Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.170735 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-dns-svc\") pod \"1467815e-0912-4dc2-b87d-4cab891b93b2\" (UID: \"1467815e-0912-4dc2-b87d-4cab891b93b2\") " Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.170899 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgxpp\" (UniqueName: \"kubernetes.io/projected/1467815e-0912-4dc2-b87d-4cab891b93b2-kube-api-access-sgxpp\") pod \"1467815e-0912-4dc2-b87d-4cab891b93b2\" (UID: \"1467815e-0912-4dc2-b87d-4cab891b93b2\") " Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.171025 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-dns-swift-storage-0\") pod \"1467815e-0912-4dc2-b87d-4cab891b93b2\" (UID: \"1467815e-0912-4dc2-b87d-4cab891b93b2\") " Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.179263 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1467815e-0912-4dc2-b87d-4cab891b93b2-kube-api-access-sgxpp" (OuterVolumeSpecName: "kube-api-access-sgxpp") pod "1467815e-0912-4dc2-b87d-4cab891b93b2" (UID: "1467815e-0912-4dc2-b87d-4cab891b93b2"). InnerVolumeSpecName "kube-api-access-sgxpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.253551 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1467815e-0912-4dc2-b87d-4cab891b93b2" (UID: "1467815e-0912-4dc2-b87d-4cab891b93b2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.260503 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-config" (OuterVolumeSpecName: "config") pod "1467815e-0912-4dc2-b87d-4cab891b93b2" (UID: "1467815e-0912-4dc2-b87d-4cab891b93b2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.274756 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.274794 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.274807 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgxpp\" (UniqueName: \"kubernetes.io/projected/1467815e-0912-4dc2-b87d-4cab891b93b2-kube-api-access-sgxpp\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.284226 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1467815e-0912-4dc2-b87d-4cab891b93b2" (UID: "1467815e-0912-4dc2-b87d-4cab891b93b2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.294869 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1467815e-0912-4dc2-b87d-4cab891b93b2" (UID: "1467815e-0912-4dc2-b87d-4cab891b93b2"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.343287 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1467815e-0912-4dc2-b87d-4cab891b93b2" (UID: "1467815e-0912-4dc2-b87d-4cab891b93b2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.376596 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.376646 4869 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.376662 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1467815e-0912-4dc2-b87d-4cab891b93b2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.765028 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-69d999cf4d-drf2r" podUID="adfe77ee-719d-4b80-ae06-8a0a370cf7d2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.805722 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" event={"ID":"1467815e-0912-4dc2-b87d-4cab891b93b2","Type":"ContainerDied","Data":"27eb9e1c9d3f083d7817ef7b0b61d41a107254768860c4fdd14c12d72e2d2420"} Feb 18 06:06:34 
crc kubenswrapper[4869]: I0218 06:06:34.806052 4869 scope.go:117] "RemoveContainer" containerID="36fdbafff93225d72b82cafe24fdc9a0997a1ed8ebf247394f3ce2dfb7a47b77" Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.805773 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-7vv8w" Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.810550 4869 generic.go:334] "Generic (PLEG): container finished" podID="adfe77ee-719d-4b80-ae06-8a0a370cf7d2" containerID="6db09d6ae8339e476ae31de77876ac066531920f74a0a3430c02d01110a5150c" exitCode=0 Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.810966 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="346e57be-67e8-41ae-8c35-8ca7be52d847" containerName="cinder-scheduler" containerID="cri-o://2b674ef05b97689ad31861ae1b0ac85b1b20724d20ada4f458e120c9993aef35" gracePeriod=30 Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.811188 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69d999cf4d-drf2r" event={"ID":"adfe77ee-719d-4b80-ae06-8a0a370cf7d2","Type":"ContainerDied","Data":"6db09d6ae8339e476ae31de77876ac066531920f74a0a3430c02d01110a5150c"} Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.811233 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="346e57be-67e8-41ae-8c35-8ca7be52d847" containerName="probe" containerID="cri-o://66d076ecdfaed3c8113dfb58c15b90a792d6bfbbb8f2080fd7a314b410017b9e" gracePeriod=30 Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.842984 4869 scope.go:117] "RemoveContainer" containerID="1a6299fde2a547fc6e0eff4cfb840ad831bec2b12166e35da121687af42e739d" Feb 18 06:06:34 crc kubenswrapper[4869]: I0218 06:06:34.864264 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-7vv8w"] Feb 18 06:06:34 crc kubenswrapper[4869]: 
I0218 06:06:34.899061 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-7vv8w"] Feb 18 06:06:35 crc kubenswrapper[4869]: I0218 06:06:35.481809 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1467815e-0912-4dc2-b87d-4cab891b93b2" path="/var/lib/kubelet/pods/1467815e-0912-4dc2-b87d-4cab891b93b2/volumes" Feb 18 06:06:35 crc kubenswrapper[4869]: I0218 06:06:35.826163 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-666896fcd4-c65vb" Feb 18 06:06:35 crc kubenswrapper[4869]: I0218 06:06:35.826546 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"207fd9cd-d5eb-4821-974c-5744bbbd981b","Type":"ContainerStarted","Data":"98bdfeb997c971016d2a1b1f493bdb788aceebb93e8c727b3903add04c16fca1"} Feb 18 06:06:35 crc kubenswrapper[4869]: I0218 06:06:35.827038 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 18 06:06:35 crc kubenswrapper[4869]: I0218 06:06:35.831268 4869 generic.go:334] "Generic (PLEG): container finished" podID="346e57be-67e8-41ae-8c35-8ca7be52d847" containerID="66d076ecdfaed3c8113dfb58c15b90a792d6bfbbb8f2080fd7a314b410017b9e" exitCode=0 Feb 18 06:06:35 crc kubenswrapper[4869]: I0218 06:06:35.831306 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"346e57be-67e8-41ae-8c35-8ca7be52d847","Type":"ContainerDied","Data":"66d076ecdfaed3c8113dfb58c15b90a792d6bfbbb8f2080fd7a314b410017b9e"} Feb 18 06:06:35 crc kubenswrapper[4869]: I0218 06:06:35.887103 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.846822683 podStartE2EDuration="6.887080352s" podCreationTimestamp="2026-02-18 06:06:29 +0000 UTC" firstStartedPulling="2026-02-18 06:06:30.940328798 +0000 UTC m=+1088.109417030" lastFinishedPulling="2026-02-18 06:06:34.980586467 +0000 UTC 
m=+1092.149674699" observedRunningTime="2026-02-18 06:06:35.873186664 +0000 UTC m=+1093.042274896" watchObservedRunningTime="2026-02-18 06:06:35.887080352 +0000 UTC m=+1093.056168584" Feb 18 06:06:37 crc kubenswrapper[4869]: I0218 06:06:37.663329 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56cbc85666-kbbjf" podUID="2d50f595-31ed-4c89-ad13-63ae638b83c0" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:47384->10.217.0.163:9311: read: connection reset by peer" Feb 18 06:06:37 crc kubenswrapper[4869]: I0218 06:06:37.663342 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56cbc85666-kbbjf" podUID="2d50f595-31ed-4c89-ad13-63ae638b83c0" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:47382->10.217.0.163:9311: read: connection reset by peer" Feb 18 06:06:37 crc kubenswrapper[4869]: I0218 06:06:37.741458 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56cbc85666-kbbjf" podUID="2d50f595-31ed-4c89-ad13-63ae638b83c0" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": dial tcp 10.217.0.163:9311: connect: connection refused" Feb 18 06:06:37 crc kubenswrapper[4869]: I0218 06:06:37.741473 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56cbc85666-kbbjf" podUID="2d50f595-31ed-4c89-ad13-63ae638b83c0" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": dial tcp 10.217.0.163:9311: connect: connection refused" Feb 18 06:06:37 crc kubenswrapper[4869]: I0218 06:06:37.867500 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 18 06:06:37 crc kubenswrapper[4869]: E0218 06:06:37.868009 4869 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="1467815e-0912-4dc2-b87d-4cab891b93b2" containerName="init" Feb 18 06:06:37 crc kubenswrapper[4869]: I0218 06:06:37.868029 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="1467815e-0912-4dc2-b87d-4cab891b93b2" containerName="init" Feb 18 06:06:37 crc kubenswrapper[4869]: E0218 06:06:37.868044 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f13a3e9-97c1-4eaa-a0fb-f449a201a542" containerName="neutron-api" Feb 18 06:06:37 crc kubenswrapper[4869]: I0218 06:06:37.868050 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f13a3e9-97c1-4eaa-a0fb-f449a201a542" containerName="neutron-api" Feb 18 06:06:37 crc kubenswrapper[4869]: E0218 06:06:37.868081 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1467815e-0912-4dc2-b87d-4cab891b93b2" containerName="dnsmasq-dns" Feb 18 06:06:37 crc kubenswrapper[4869]: I0218 06:06:37.868087 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="1467815e-0912-4dc2-b87d-4cab891b93b2" containerName="dnsmasq-dns" Feb 18 06:06:37 crc kubenswrapper[4869]: E0218 06:06:37.868096 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f13a3e9-97c1-4eaa-a0fb-f449a201a542" containerName="neutron-httpd" Feb 18 06:06:37 crc kubenswrapper[4869]: I0218 06:06:37.868103 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f13a3e9-97c1-4eaa-a0fb-f449a201a542" containerName="neutron-httpd" Feb 18 06:06:37 crc kubenswrapper[4869]: I0218 06:06:37.868262 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f13a3e9-97c1-4eaa-a0fb-f449a201a542" containerName="neutron-api" Feb 18 06:06:37 crc kubenswrapper[4869]: I0218 06:06:37.868275 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f13a3e9-97c1-4eaa-a0fb-f449a201a542" containerName="neutron-httpd" Feb 18 06:06:37 crc kubenswrapper[4869]: I0218 06:06:37.868286 4869 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1467815e-0912-4dc2-b87d-4cab891b93b2" containerName="dnsmasq-dns" Feb 18 06:06:37 crc kubenswrapper[4869]: I0218 06:06:37.868869 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 18 06:06:37 crc kubenswrapper[4869]: I0218 06:06:37.875069 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-wh679" Feb 18 06:06:37 crc kubenswrapper[4869]: I0218 06:06:37.875148 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 18 06:06:37 crc kubenswrapper[4869]: I0218 06:06:37.875476 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 18 06:06:37 crc kubenswrapper[4869]: I0218 06:06:37.912575 4869 generic.go:334] "Generic (PLEG): container finished" podID="2d50f595-31ed-4c89-ad13-63ae638b83c0" containerID="a4a68942bd2658b39f257bcc42ff5f9e3acb414817fdf61022d741486db2e479" exitCode=0 Feb 18 06:06:37 crc kubenswrapper[4869]: I0218 06:06:37.912633 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56cbc85666-kbbjf" event={"ID":"2d50f595-31ed-4c89-ad13-63ae638b83c0","Type":"ContainerDied","Data":"a4a68942bd2658b39f257bcc42ff5f9e3acb414817fdf61022d741486db2e479"} Feb 18 06:06:37 crc kubenswrapper[4869]: I0218 06:06:37.931445 4869 generic.go:334] "Generic (PLEG): container finished" podID="346e57be-67e8-41ae-8c35-8ca7be52d847" containerID="2b674ef05b97689ad31861ae1b0ac85b1b20724d20ada4f458e120c9993aef35" exitCode=0 Feb 18 06:06:37 crc kubenswrapper[4869]: I0218 06:06:37.931492 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"346e57be-67e8-41ae-8c35-8ca7be52d847","Type":"ContainerDied","Data":"2b674ef05b97689ad31861ae1b0ac85b1b20724d20ada4f458e120c9993aef35"} Feb 18 06:06:37 crc kubenswrapper[4869]: I0218 06:06:37.931517 4869 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"346e57be-67e8-41ae-8c35-8ca7be52d847","Type":"ContainerDied","Data":"014c1d0430ac54f5a64ff8e03626b8287d3024a004476f824a46212f8f01de46"} Feb 18 06:06:37 crc kubenswrapper[4869]: I0218 06:06:37.931528 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="014c1d0430ac54f5a64ff8e03626b8287d3024a004476f824a46212f8f01de46" Feb 18 06:06:37 crc kubenswrapper[4869]: I0218 06:06:37.936402 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 18 06:06:37 crc kubenswrapper[4869]: I0218 06:06:37.939301 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.050339 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/346e57be-67e8-41ae-8c35-8ca7be52d847-scripts\") pod \"346e57be-67e8-41ae-8c35-8ca7be52d847\" (UID: \"346e57be-67e8-41ae-8c35-8ca7be52d847\") " Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.050549 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/346e57be-67e8-41ae-8c35-8ca7be52d847-config-data\") pod \"346e57be-67e8-41ae-8c35-8ca7be52d847\" (UID: \"346e57be-67e8-41ae-8c35-8ca7be52d847\") " Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.050605 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2pfq\" (UniqueName: \"kubernetes.io/projected/346e57be-67e8-41ae-8c35-8ca7be52d847-kube-api-access-g2pfq\") pod \"346e57be-67e8-41ae-8c35-8ca7be52d847\" (UID: \"346e57be-67e8-41ae-8c35-8ca7be52d847\") " Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.050643 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/346e57be-67e8-41ae-8c35-8ca7be52d847-config-data-custom\") pod \"346e57be-67e8-41ae-8c35-8ca7be52d847\" (UID: \"346e57be-67e8-41ae-8c35-8ca7be52d847\") " Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.050684 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/346e57be-67e8-41ae-8c35-8ca7be52d847-combined-ca-bundle\") pod \"346e57be-67e8-41ae-8c35-8ca7be52d847\" (UID: \"346e57be-67e8-41ae-8c35-8ca7be52d847\") " Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.050781 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/346e57be-67e8-41ae-8c35-8ca7be52d847-etc-machine-id\") pod \"346e57be-67e8-41ae-8c35-8ca7be52d847\" (UID: \"346e57be-67e8-41ae-8c35-8ca7be52d847\") " Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.051025 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/348bdc65-0fd0-4870-adfc-d0d69a51e762-openstack-config-secret\") pod \"openstackclient\" (UID: \"348bdc65-0fd0-4870-adfc-d0d69a51e762\") " pod="openstack/openstackclient" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.051168 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n7p8\" (UniqueName: \"kubernetes.io/projected/348bdc65-0fd0-4870-adfc-d0d69a51e762-kube-api-access-2n7p8\") pod \"openstackclient\" (UID: \"348bdc65-0fd0-4870-adfc-d0d69a51e762\") " pod="openstack/openstackclient" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.051222 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/348bdc65-0fd0-4870-adfc-d0d69a51e762-openstack-config\") pod \"openstackclient\" (UID: 
\"348bdc65-0fd0-4870-adfc-d0d69a51e762\") " pod="openstack/openstackclient" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.051255 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/348bdc65-0fd0-4870-adfc-d0d69a51e762-combined-ca-bundle\") pod \"openstackclient\" (UID: \"348bdc65-0fd0-4870-adfc-d0d69a51e762\") " pod="openstack/openstackclient" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.052637 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/346e57be-67e8-41ae-8c35-8ca7be52d847-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "346e57be-67e8-41ae-8c35-8ca7be52d847" (UID: "346e57be-67e8-41ae-8c35-8ca7be52d847"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.060932 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/346e57be-67e8-41ae-8c35-8ca7be52d847-scripts" (OuterVolumeSpecName: "scripts") pod "346e57be-67e8-41ae-8c35-8ca7be52d847" (UID: "346e57be-67e8-41ae-8c35-8ca7be52d847"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.060960 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/346e57be-67e8-41ae-8c35-8ca7be52d847-kube-api-access-g2pfq" (OuterVolumeSpecName: "kube-api-access-g2pfq") pod "346e57be-67e8-41ae-8c35-8ca7be52d847" (UID: "346e57be-67e8-41ae-8c35-8ca7be52d847"). InnerVolumeSpecName "kube-api-access-g2pfq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.069921 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/346e57be-67e8-41ae-8c35-8ca7be52d847-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "346e57be-67e8-41ae-8c35-8ca7be52d847" (UID: "346e57be-67e8-41ae-8c35-8ca7be52d847"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.128616 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/346e57be-67e8-41ae-8c35-8ca7be52d847-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "346e57be-67e8-41ae-8c35-8ca7be52d847" (UID: "346e57be-67e8-41ae-8c35-8ca7be52d847"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.139528 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-56cbc85666-kbbjf" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.153343 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/348bdc65-0fd0-4870-adfc-d0d69a51e762-openstack-config\") pod \"openstackclient\" (UID: \"348bdc65-0fd0-4870-adfc-d0d69a51e762\") " pod="openstack/openstackclient" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.153400 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/348bdc65-0fd0-4870-adfc-d0d69a51e762-combined-ca-bundle\") pod \"openstackclient\" (UID: \"348bdc65-0fd0-4870-adfc-d0d69a51e762\") " pod="openstack/openstackclient" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.153433 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/348bdc65-0fd0-4870-adfc-d0d69a51e762-openstack-config-secret\") pod \"openstackclient\" (UID: \"348bdc65-0fd0-4870-adfc-d0d69a51e762\") " pod="openstack/openstackclient" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.153597 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n7p8\" (UniqueName: \"kubernetes.io/projected/348bdc65-0fd0-4870-adfc-d0d69a51e762-kube-api-access-2n7p8\") pod \"openstackclient\" (UID: \"348bdc65-0fd0-4870-adfc-d0d69a51e762\") " pod="openstack/openstackclient" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.153685 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/346e57be-67e8-41ae-8c35-8ca7be52d847-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.153696 4869 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/346e57be-67e8-41ae-8c35-8ca7be52d847-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.153724 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/346e57be-67e8-41ae-8c35-8ca7be52d847-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.153735 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2pfq\" (UniqueName: \"kubernetes.io/projected/346e57be-67e8-41ae-8c35-8ca7be52d847-kube-api-access-g2pfq\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.153770 4869 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/346e57be-67e8-41ae-8c35-8ca7be52d847-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.154514 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/348bdc65-0fd0-4870-adfc-d0d69a51e762-openstack-config\") pod \"openstackclient\" (UID: \"348bdc65-0fd0-4870-adfc-d0d69a51e762\") " pod="openstack/openstackclient" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.172343 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/348bdc65-0fd0-4870-adfc-d0d69a51e762-combined-ca-bundle\") pod \"openstackclient\" (UID: \"348bdc65-0fd0-4870-adfc-d0d69a51e762\") " pod="openstack/openstackclient" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.179075 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/348bdc65-0fd0-4870-adfc-d0d69a51e762-openstack-config-secret\") pod \"openstackclient\" (UID: \"348bdc65-0fd0-4870-adfc-d0d69a51e762\") " 
pod="openstack/openstackclient" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.189817 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/346e57be-67e8-41ae-8c35-8ca7be52d847-config-data" (OuterVolumeSpecName: "config-data") pod "346e57be-67e8-41ae-8c35-8ca7be52d847" (UID: "346e57be-67e8-41ae-8c35-8ca7be52d847"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.197185 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n7p8\" (UniqueName: \"kubernetes.io/projected/348bdc65-0fd0-4870-adfc-d0d69a51e762-kube-api-access-2n7p8\") pod \"openstackclient\" (UID: \"348bdc65-0fd0-4870-adfc-d0d69a51e762\") " pod="openstack/openstackclient" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.212755 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-55b8768b96-mwv6g" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.213823 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-55b8768b96-mwv6g" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.254839 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d50f595-31ed-4c89-ad13-63ae638b83c0-combined-ca-bundle\") pod \"2d50f595-31ed-4c89-ad13-63ae638b83c0\" (UID: \"2d50f595-31ed-4c89-ad13-63ae638b83c0\") " Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.254907 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d50f595-31ed-4c89-ad13-63ae638b83c0-logs\") pod \"2d50f595-31ed-4c89-ad13-63ae638b83c0\" (UID: \"2d50f595-31ed-4c89-ad13-63ae638b83c0\") " Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.254946 4869 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-888ql\" (UniqueName: \"kubernetes.io/projected/2d50f595-31ed-4c89-ad13-63ae638b83c0-kube-api-access-888ql\") pod \"2d50f595-31ed-4c89-ad13-63ae638b83c0\" (UID: \"2d50f595-31ed-4c89-ad13-63ae638b83c0\") " Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.255077 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d50f595-31ed-4c89-ad13-63ae638b83c0-config-data\") pod \"2d50f595-31ed-4c89-ad13-63ae638b83c0\" (UID: \"2d50f595-31ed-4c89-ad13-63ae638b83c0\") " Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.255111 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d50f595-31ed-4c89-ad13-63ae638b83c0-config-data-custom\") pod \"2d50f595-31ed-4c89-ad13-63ae638b83c0\" (UID: \"2d50f595-31ed-4c89-ad13-63ae638b83c0\") " Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.255662 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/346e57be-67e8-41ae-8c35-8ca7be52d847-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.260898 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d50f595-31ed-4c89-ad13-63ae638b83c0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2d50f595-31ed-4c89-ad13-63ae638b83c0" (UID: "2d50f595-31ed-4c89-ad13-63ae638b83c0"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.261202 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d50f595-31ed-4c89-ad13-63ae638b83c0-logs" (OuterVolumeSpecName: "logs") pod "2d50f595-31ed-4c89-ad13-63ae638b83c0" (UID: "2d50f595-31ed-4c89-ad13-63ae638b83c0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.264327 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d50f595-31ed-4c89-ad13-63ae638b83c0-kube-api-access-888ql" (OuterVolumeSpecName: "kube-api-access-888ql") pod "2d50f595-31ed-4c89-ad13-63ae638b83c0" (UID: "2d50f595-31ed-4c89-ad13-63ae638b83c0"). InnerVolumeSpecName "kube-api-access-888ql". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.264450 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.347694 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d50f595-31ed-4c89-ad13-63ae638b83c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d50f595-31ed-4c89-ad13-63ae638b83c0" (UID: "2d50f595-31ed-4c89-ad13-63ae638b83c0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.357018 4869 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d50f595-31ed-4c89-ad13-63ae638b83c0-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.357049 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d50f595-31ed-4c89-ad13-63ae638b83c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.357059 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d50f595-31ed-4c89-ad13-63ae638b83c0-logs\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.357069 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-888ql\" (UniqueName: \"kubernetes.io/projected/2d50f595-31ed-4c89-ad13-63ae638b83c0-kube-api-access-888ql\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.367696 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6558576dd4-9gx96"] Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.367973 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6558576dd4-9gx96" podUID="953c9b1f-e673-499d-a6bf-9a20e8d4e69e" containerName="placement-log" containerID="cri-o://24bd6d819bf8cd5bfda9ebdaec36688778e6e8216b265ef48658d29cfda6cb60" gracePeriod=30 Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.368114 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6558576dd4-9gx96" podUID="953c9b1f-e673-499d-a6bf-9a20e8d4e69e" containerName="placement-api" containerID="cri-o://0610aa1c580eda1201bb0355e4a294c7c1ed25aed49c83ad5fd9185ce11a18b7" 
gracePeriod=30 Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.409896 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d50f595-31ed-4c89-ad13-63ae638b83c0-config-data" (OuterVolumeSpecName: "config-data") pod "2d50f595-31ed-4c89-ad13-63ae638b83c0" (UID: "2d50f595-31ed-4c89-ad13-63ae638b83c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.459498 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d50f595-31ed-4c89-ad13-63ae638b83c0-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.837912 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.943951 4869 generic.go:334] "Generic (PLEG): container finished" podID="953c9b1f-e673-499d-a6bf-9a20e8d4e69e" containerID="24bd6d819bf8cd5bfda9ebdaec36688778e6e8216b265ef48658d29cfda6cb60" exitCode=143 Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.944024 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6558576dd4-9gx96" event={"ID":"953c9b1f-e673-499d-a6bf-9a20e8d4e69e","Type":"ContainerDied","Data":"24bd6d819bf8cd5bfda9ebdaec36688778e6e8216b265ef48658d29cfda6cb60"} Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.947167 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56cbc85666-kbbjf" event={"ID":"2d50f595-31ed-4c89-ad13-63ae638b83c0","Type":"ContainerDied","Data":"fd7d0da666cf9f41da86d769806f8e2a79b393a2bc80813938a9605d160dbc3f"} Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.947309 4869 scope.go:117] "RemoveContainer" containerID="a4a68942bd2658b39f257bcc42ff5f9e3acb414817fdf61022d741486db2e479" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.947215 4869 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56cbc85666-kbbjf" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.948656 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"348bdc65-0fd0-4870-adfc-d0d69a51e762","Type":"ContainerStarted","Data":"928c30f18531c027469300de0f50d0c064f87cb59e8300ccdfdfc17923f2b164"} Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.948840 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.979679 4869 scope.go:117] "RemoveContainer" containerID="8241cfb2e3663f67e7354d6c6c51b02d9b2c80557c89e590e2bfb85ce1063b0d" Feb 18 06:06:38 crc kubenswrapper[4869]: I0218 06:06:38.989540 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.007115 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.021812 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-56cbc85666-kbbjf"] Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.032859 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-56cbc85666-kbbjf"] Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.040866 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 06:06:39 crc kubenswrapper[4869]: E0218 06:06:39.041418 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d50f595-31ed-4c89-ad13-63ae638b83c0" containerName="barbican-api" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.041440 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d50f595-31ed-4c89-ad13-63ae638b83c0" containerName="barbican-api" Feb 18 06:06:39 crc kubenswrapper[4869]: E0218 
06:06:39.041466 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="346e57be-67e8-41ae-8c35-8ca7be52d847" containerName="cinder-scheduler" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.041494 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="346e57be-67e8-41ae-8c35-8ca7be52d847" containerName="cinder-scheduler" Feb 18 06:06:39 crc kubenswrapper[4869]: E0218 06:06:39.041517 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="346e57be-67e8-41ae-8c35-8ca7be52d847" containerName="probe" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.041527 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="346e57be-67e8-41ae-8c35-8ca7be52d847" containerName="probe" Feb 18 06:06:39 crc kubenswrapper[4869]: E0218 06:06:39.041550 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d50f595-31ed-4c89-ad13-63ae638b83c0" containerName="barbican-api-log" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.041559 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d50f595-31ed-4c89-ad13-63ae638b83c0" containerName="barbican-api-log" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.041820 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d50f595-31ed-4c89-ad13-63ae638b83c0" containerName="barbican-api" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.041846 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="346e57be-67e8-41ae-8c35-8ca7be52d847" containerName="probe" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.041860 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="346e57be-67e8-41ae-8c35-8ca7be52d847" containerName="cinder-scheduler" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.041873 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d50f595-31ed-4c89-ad13-63ae638b83c0" containerName="barbican-api-log" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.043277 4869 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.057787 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.063354 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.176113 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c911137c-7aa9-4875-ae81-e91caebd828a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c911137c-7aa9-4875-ae81-e91caebd828a\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.176176 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c911137c-7aa9-4875-ae81-e91caebd828a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c911137c-7aa9-4875-ae81-e91caebd828a\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.176232 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c911137c-7aa9-4875-ae81-e91caebd828a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c911137c-7aa9-4875-ae81-e91caebd828a\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.176272 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c911137c-7aa9-4875-ae81-e91caebd828a-config-data\") pod \"cinder-scheduler-0\" (UID: \"c911137c-7aa9-4875-ae81-e91caebd828a\") " pod="openstack/cinder-scheduler-0" Feb 18 
06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.176308 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg7vx\" (UniqueName: \"kubernetes.io/projected/c911137c-7aa9-4875-ae81-e91caebd828a-kube-api-access-cg7vx\") pod \"cinder-scheduler-0\" (UID: \"c911137c-7aa9-4875-ae81-e91caebd828a\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.176352 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c911137c-7aa9-4875-ae81-e91caebd828a-scripts\") pod \"cinder-scheduler-0\" (UID: \"c911137c-7aa9-4875-ae81-e91caebd828a\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.277605 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c911137c-7aa9-4875-ae81-e91caebd828a-config-data\") pod \"cinder-scheduler-0\" (UID: \"c911137c-7aa9-4875-ae81-e91caebd828a\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.277681 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg7vx\" (UniqueName: \"kubernetes.io/projected/c911137c-7aa9-4875-ae81-e91caebd828a-kube-api-access-cg7vx\") pod \"cinder-scheduler-0\" (UID: \"c911137c-7aa9-4875-ae81-e91caebd828a\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.277731 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c911137c-7aa9-4875-ae81-e91caebd828a-scripts\") pod \"cinder-scheduler-0\" (UID: \"c911137c-7aa9-4875-ae81-e91caebd828a\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.277782 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c911137c-7aa9-4875-ae81-e91caebd828a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c911137c-7aa9-4875-ae81-e91caebd828a\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.277813 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c911137c-7aa9-4875-ae81-e91caebd828a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c911137c-7aa9-4875-ae81-e91caebd828a\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.277870 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c911137c-7aa9-4875-ae81-e91caebd828a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c911137c-7aa9-4875-ae81-e91caebd828a\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.277959 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c911137c-7aa9-4875-ae81-e91caebd828a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c911137c-7aa9-4875-ae81-e91caebd828a\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.286243 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c911137c-7aa9-4875-ae81-e91caebd828a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c911137c-7aa9-4875-ae81-e91caebd828a\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.287071 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c911137c-7aa9-4875-ae81-e91caebd828a-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"c911137c-7aa9-4875-ae81-e91caebd828a\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.287662 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c911137c-7aa9-4875-ae81-e91caebd828a-scripts\") pod \"cinder-scheduler-0\" (UID: \"c911137c-7aa9-4875-ae81-e91caebd828a\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.288997 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c911137c-7aa9-4875-ae81-e91caebd828a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c911137c-7aa9-4875-ae81-e91caebd828a\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.296406 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg7vx\" (UniqueName: \"kubernetes.io/projected/c911137c-7aa9-4875-ae81-e91caebd828a-kube-api-access-cg7vx\") pod \"cinder-scheduler-0\" (UID: \"c911137c-7aa9-4875-ae81-e91caebd828a\") " pod="openstack/cinder-scheduler-0" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.407168 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.498879 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d50f595-31ed-4c89-ad13-63ae638b83c0" path="/var/lib/kubelet/pods/2d50f595-31ed-4c89-ad13-63ae638b83c0/volumes" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.499578 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="346e57be-67e8-41ae-8c35-8ca7be52d847" path="/var/lib/kubelet/pods/346e57be-67e8-41ae-8c35-8ca7be52d847/volumes" Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.932301 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 06:06:39 crc kubenswrapper[4869]: I0218 06:06:39.960727 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c911137c-7aa9-4875-ae81-e91caebd828a","Type":"ContainerStarted","Data":"8efc21080740a4235a121603722d6ceef3e40b4e3961b2684821fca2bbd51ba8"} Feb 18 06:06:40 crc kubenswrapper[4869]: I0218 06:06:40.133020 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:06:40 crc kubenswrapper[4869]: I0218 06:06:40.133300 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:06:40 crc kubenswrapper[4869]: I0218 06:06:40.978083 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"c911137c-7aa9-4875-ae81-e91caebd828a","Type":"ContainerStarted","Data":"6c7ddfd23e46ea5902d68f6976225b1fa123282841c4b55915fe066e013c039a"} Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.267925 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-846bf8ff8c-7j4wb"] Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.270114 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.273658 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.277905 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.279904 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.289884 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-846bf8ff8c-7j4wb"] Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.364698 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckq62\" (UniqueName: \"kubernetes.io/projected/23e9d7e3-bcc7-493e-84a6-e646ab36e6f0-kube-api-access-ckq62\") pod \"swift-proxy-846bf8ff8c-7j4wb\" (UID: \"23e9d7e3-bcc7-493e-84a6-e646ab36e6f0\") " pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.364801 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e9d7e3-bcc7-493e-84a6-e646ab36e6f0-config-data\") pod \"swift-proxy-846bf8ff8c-7j4wb\" (UID: \"23e9d7e3-bcc7-493e-84a6-e646ab36e6f0\") " pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:41 
crc kubenswrapper[4869]: I0218 06:06:41.364826 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23e9d7e3-bcc7-493e-84a6-e646ab36e6f0-run-httpd\") pod \"swift-proxy-846bf8ff8c-7j4wb\" (UID: \"23e9d7e3-bcc7-493e-84a6-e646ab36e6f0\") " pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.364858 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23e9d7e3-bcc7-493e-84a6-e646ab36e6f0-internal-tls-certs\") pod \"swift-proxy-846bf8ff8c-7j4wb\" (UID: \"23e9d7e3-bcc7-493e-84a6-e646ab36e6f0\") " pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.364904 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23e9d7e3-bcc7-493e-84a6-e646ab36e6f0-public-tls-certs\") pod \"swift-proxy-846bf8ff8c-7j4wb\" (UID: \"23e9d7e3-bcc7-493e-84a6-e646ab36e6f0\") " pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.370917 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23e9d7e3-bcc7-493e-84a6-e646ab36e6f0-log-httpd\") pod \"swift-proxy-846bf8ff8c-7j4wb\" (UID: \"23e9d7e3-bcc7-493e-84a6-e646ab36e6f0\") " pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.370966 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e9d7e3-bcc7-493e-84a6-e646ab36e6f0-combined-ca-bundle\") pod \"swift-proxy-846bf8ff8c-7j4wb\" (UID: \"23e9d7e3-bcc7-493e-84a6-e646ab36e6f0\") " 
pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.371020 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/23e9d7e3-bcc7-493e-84a6-e646ab36e6f0-etc-swift\") pod \"swift-proxy-846bf8ff8c-7j4wb\" (UID: \"23e9d7e3-bcc7-493e-84a6-e646ab36e6f0\") " pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.473925 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckq62\" (UniqueName: \"kubernetes.io/projected/23e9d7e3-bcc7-493e-84a6-e646ab36e6f0-kube-api-access-ckq62\") pod \"swift-proxy-846bf8ff8c-7j4wb\" (UID: \"23e9d7e3-bcc7-493e-84a6-e646ab36e6f0\") " pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.474010 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e9d7e3-bcc7-493e-84a6-e646ab36e6f0-config-data\") pod \"swift-proxy-846bf8ff8c-7j4wb\" (UID: \"23e9d7e3-bcc7-493e-84a6-e646ab36e6f0\") " pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.474032 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23e9d7e3-bcc7-493e-84a6-e646ab36e6f0-run-httpd\") pod \"swift-proxy-846bf8ff8c-7j4wb\" (UID: \"23e9d7e3-bcc7-493e-84a6-e646ab36e6f0\") " pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.474060 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23e9d7e3-bcc7-493e-84a6-e646ab36e6f0-internal-tls-certs\") pod \"swift-proxy-846bf8ff8c-7j4wb\" (UID: \"23e9d7e3-bcc7-493e-84a6-e646ab36e6f0\") " pod="openstack/swift-proxy-846bf8ff8c-7j4wb" 
Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.474099 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23e9d7e3-bcc7-493e-84a6-e646ab36e6f0-public-tls-certs\") pod \"swift-proxy-846bf8ff8c-7j4wb\" (UID: \"23e9d7e3-bcc7-493e-84a6-e646ab36e6f0\") " pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.474129 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23e9d7e3-bcc7-493e-84a6-e646ab36e6f0-log-httpd\") pod \"swift-proxy-846bf8ff8c-7j4wb\" (UID: \"23e9d7e3-bcc7-493e-84a6-e646ab36e6f0\") " pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.474160 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e9d7e3-bcc7-493e-84a6-e646ab36e6f0-combined-ca-bundle\") pod \"swift-proxy-846bf8ff8c-7j4wb\" (UID: \"23e9d7e3-bcc7-493e-84a6-e646ab36e6f0\") " pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.474197 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/23e9d7e3-bcc7-493e-84a6-e646ab36e6f0-etc-swift\") pod \"swift-proxy-846bf8ff8c-7j4wb\" (UID: \"23e9d7e3-bcc7-493e-84a6-e646ab36e6f0\") " pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.475438 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23e9d7e3-bcc7-493e-84a6-e646ab36e6f0-log-httpd\") pod \"swift-proxy-846bf8ff8c-7j4wb\" (UID: \"23e9d7e3-bcc7-493e-84a6-e646ab36e6f0\") " pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.476868 4869 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23e9d7e3-bcc7-493e-84a6-e646ab36e6f0-run-httpd\") pod \"swift-proxy-846bf8ff8c-7j4wb\" (UID: \"23e9d7e3-bcc7-493e-84a6-e646ab36e6f0\") " pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.491420 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23e9d7e3-bcc7-493e-84a6-e646ab36e6f0-internal-tls-certs\") pod \"swift-proxy-846bf8ff8c-7j4wb\" (UID: \"23e9d7e3-bcc7-493e-84a6-e646ab36e6f0\") " pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.492062 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23e9d7e3-bcc7-493e-84a6-e646ab36e6f0-public-tls-certs\") pod \"swift-proxy-846bf8ff8c-7j4wb\" (UID: \"23e9d7e3-bcc7-493e-84a6-e646ab36e6f0\") " pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.499721 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e9d7e3-bcc7-493e-84a6-e646ab36e6f0-config-data\") pod \"swift-proxy-846bf8ff8c-7j4wb\" (UID: \"23e9d7e3-bcc7-493e-84a6-e646ab36e6f0\") " pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.501249 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckq62\" (UniqueName: \"kubernetes.io/projected/23e9d7e3-bcc7-493e-84a6-e646ab36e6f0-kube-api-access-ckq62\") pod \"swift-proxy-846bf8ff8c-7j4wb\" (UID: \"23e9d7e3-bcc7-493e-84a6-e646ab36e6f0\") " pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.504180 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/projected/23e9d7e3-bcc7-493e-84a6-e646ab36e6f0-etc-swift\") pod \"swift-proxy-846bf8ff8c-7j4wb\" (UID: \"23e9d7e3-bcc7-493e-84a6-e646ab36e6f0\") " pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.533669 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e9d7e3-bcc7-493e-84a6-e646ab36e6f0-combined-ca-bundle\") pod \"swift-proxy-846bf8ff8c-7j4wb\" (UID: \"23e9d7e3-bcc7-493e-84a6-e646ab36e6f0\") " pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.595678 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:41 crc kubenswrapper[4869]: I0218 06:06:41.998264 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c911137c-7aa9-4875-ae81-e91caebd828a","Type":"ContainerStarted","Data":"4f4423f547a660ce75919cc6f92e2013c17266405556bfb173ca0109b61fea8b"} Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.002612 4869 generic.go:334] "Generic (PLEG): container finished" podID="953c9b1f-e673-499d-a6bf-9a20e8d4e69e" containerID="0610aa1c580eda1201bb0355e4a294c7c1ed25aed49c83ad5fd9185ce11a18b7" exitCode=0 Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.002676 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6558576dd4-9gx96" event={"ID":"953c9b1f-e673-499d-a6bf-9a20e8d4e69e","Type":"ContainerDied","Data":"0610aa1c580eda1201bb0355e4a294c7c1ed25aed49c83ad5fd9185ce11a18b7"} Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.002710 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6558576dd4-9gx96" event={"ID":"953c9b1f-e673-499d-a6bf-9a20e8d4e69e","Type":"ContainerDied","Data":"108392cc67f37203d82b4e56175a13dcda0e43b7e69dbbf296cf6108030f4cfb"} Feb 18 06:06:42 crc 
kubenswrapper[4869]: I0218 06:06:42.002723 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="108392cc67f37203d82b4e56175a13dcda0e43b7e69dbbf296cf6108030f4cfb" Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.025926 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.025902599 podStartE2EDuration="4.025902599s" podCreationTimestamp="2026-02-18 06:06:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:42.020102339 +0000 UTC m=+1099.189190571" watchObservedRunningTime="2026-02-18 06:06:42.025902599 +0000 UTC m=+1099.194990831" Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.046183 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.092614 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq4nj\" (UniqueName: \"kubernetes.io/projected/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-kube-api-access-nq4nj\") pod \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\" (UID: \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.092676 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-internal-tls-certs\") pod \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\" (UID: \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.092704 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-logs\") pod \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\" (UID: 
\"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.092757 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-config-data\") pod \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\" (UID: \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.092804 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-combined-ca-bundle\") pod \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\" (UID: \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.092837 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-public-tls-certs\") pod \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\" (UID: \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.093561 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-logs" (OuterVolumeSpecName: "logs") pod "953c9b1f-e673-499d-a6bf-9a20e8d4e69e" (UID: "953c9b1f-e673-499d-a6bf-9a20e8d4e69e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.096809 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-scripts\") pod \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\" (UID: \"953c9b1f-e673-499d-a6bf-9a20e8d4e69e\") " Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.105223 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.105567 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="207fd9cd-d5eb-4821-974c-5744bbbd981b" containerName="ceilometer-central-agent" containerID="cri-o://75379434ada0a52f949c485cf893c5f7790e9079bd586a92916294a5ffff6873" gracePeriod=30 Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.105686 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="207fd9cd-d5eb-4821-974c-5744bbbd981b" containerName="proxy-httpd" containerID="cri-o://98bdfeb997c971016d2a1b1f493bdb788aceebb93e8c727b3903add04c16fca1" gracePeriod=30 Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.105721 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="207fd9cd-d5eb-4821-974c-5744bbbd981b" containerName="sg-core" containerID="cri-o://788fa423653f9600123381245e5afb69ced12ef077dfd78321fbd98fb8a2c1cb" gracePeriod=30 Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.105797 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-logs\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.105782 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="207fd9cd-d5eb-4821-974c-5744bbbd981b" containerName="ceilometer-notification-agent" containerID="cri-o://260f02e0990e8df6d6e821a2fdf4a3d71bc60774b01bf203f11c6671c447026b" gracePeriod=30 Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.109849 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-kube-api-access-nq4nj" (OuterVolumeSpecName: "kube-api-access-nq4nj") pod "953c9b1f-e673-499d-a6bf-9a20e8d4e69e" (UID: "953c9b1f-e673-499d-a6bf-9a20e8d4e69e"). InnerVolumeSpecName "kube-api-access-nq4nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.120503 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-scripts" (OuterVolumeSpecName: "scripts") pod "953c9b1f-e673-499d-a6bf-9a20e8d4e69e" (UID: "953c9b1f-e673-499d-a6bf-9a20e8d4e69e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.198472 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-config-data" (OuterVolumeSpecName: "config-data") pod "953c9b1f-e673-499d-a6bf-9a20e8d4e69e" (UID: "953c9b1f-e673-499d-a6bf-9a20e8d4e69e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.204057 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "953c9b1f-e673-499d-a6bf-9a20e8d4e69e" (UID: "953c9b1f-e673-499d-a6bf-9a20e8d4e69e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.210627 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.210671 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.210686 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.210698 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq4nj\" (UniqueName: \"kubernetes.io/projected/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-kube-api-access-nq4nj\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.247008 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "953c9b1f-e673-499d-a6bf-9a20e8d4e69e" (UID: "953c9b1f-e673-499d-a6bf-9a20e8d4e69e"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.255101 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-846bf8ff8c-7j4wb"] Feb 18 06:06:42 crc kubenswrapper[4869]: W0218 06:06:42.260487 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23e9d7e3_bcc7_493e_84a6_e646ab36e6f0.slice/crio-9daf82ff6eedd76a9d43d4aa09af262b8dc2871e61f4413748e57662c713f7e4 WatchSource:0}: Error finding container 9daf82ff6eedd76a9d43d4aa09af262b8dc2871e61f4413748e57662c713f7e4: Status 404 returned error can't find the container with id 9daf82ff6eedd76a9d43d4aa09af262b8dc2871e61f4413748e57662c713f7e4 Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.278930 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "953c9b1f-e673-499d-a6bf-9a20e8d4e69e" (UID: "953c9b1f-e673-499d-a6bf-9a20e8d4e69e"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.315528 4869 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.315596 4869 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/953c9b1f-e673-499d-a6bf-9a20e8d4e69e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:42 crc kubenswrapper[4869]: I0218 06:06:42.474843 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.016237 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-846bf8ff8c-7j4wb" event={"ID":"23e9d7e3-bcc7-493e-84a6-e646ab36e6f0","Type":"ContainerStarted","Data":"9e03f22cb45b269e62a119fdfe6916cdef39d8a1ea11262aa4d745c5b29c2cb9"} Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.016654 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-846bf8ff8c-7j4wb" event={"ID":"23e9d7e3-bcc7-493e-84a6-e646ab36e6f0","Type":"ContainerStarted","Data":"51c37c34bf7523d04f3d14eda0529f158b0ef3dd95c3bb352f1e4da04cb5e4e1"} Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.016665 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-846bf8ff8c-7j4wb" event={"ID":"23e9d7e3-bcc7-493e-84a6-e646ab36e6f0","Type":"ContainerStarted","Data":"9daf82ff6eedd76a9d43d4aa09af262b8dc2871e61f4413748e57662c713f7e4"} Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.017602 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.017697 4869 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.037501 4869 generic.go:334] "Generic (PLEG): container finished" podID="207fd9cd-d5eb-4821-974c-5744bbbd981b" containerID="98bdfeb997c971016d2a1b1f493bdb788aceebb93e8c727b3903add04c16fca1" exitCode=0 Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.037538 4869 generic.go:334] "Generic (PLEG): container finished" podID="207fd9cd-d5eb-4821-974c-5744bbbd981b" containerID="788fa423653f9600123381245e5afb69ced12ef077dfd78321fbd98fb8a2c1cb" exitCode=2 Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.037545 4869 generic.go:334] "Generic (PLEG): container finished" podID="207fd9cd-d5eb-4821-974c-5744bbbd981b" containerID="75379434ada0a52f949c485cf893c5f7790e9079bd586a92916294a5ffff6873" exitCode=0 Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.037817 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"207fd9cd-d5eb-4821-974c-5744bbbd981b","Type":"ContainerDied","Data":"98bdfeb997c971016d2a1b1f493bdb788aceebb93e8c727b3903add04c16fca1"} Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.037892 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"207fd9cd-d5eb-4821-974c-5744bbbd981b","Type":"ContainerDied","Data":"788fa423653f9600123381245e5afb69ced12ef077dfd78321fbd98fb8a2c1cb"} Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.037912 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"207fd9cd-d5eb-4821-974c-5744bbbd981b","Type":"ContainerDied","Data":"75379434ada0a52f949c485cf893c5f7790e9079bd586a92916294a5ffff6873"} Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.038040 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6558576dd4-9gx96" Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.041167 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-846bf8ff8c-7j4wb" podStartSLOduration=2.041141748 podStartE2EDuration="2.041141748s" podCreationTimestamp="2026-02-18 06:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:43.035672905 +0000 UTC m=+1100.204761137" watchObservedRunningTime="2026-02-18 06:06:43.041141748 +0000 UTC m=+1100.210229980" Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.090086 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6558576dd4-9gx96"] Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.097005 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6558576dd4-9gx96"] Feb 18 06:06:43 crc kubenswrapper[4869]: E0218 06:06:43.110573 4869 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod953c9b1f_e673_499d_a6bf_9a20e8d4e69e.slice\": RecentStats: unable to find data in memory cache]" Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.272943 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.278601 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="696bf351-11ee-47b1-bda4-8968aa32af8f" containerName="glance-log" containerID="cri-o://3b4990a9258d9c42d96c115fe51582a869ae9a39017962bb7dc0b390043f3429" gracePeriod=30 Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.279574 4869 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-external-api-0" podUID="696bf351-11ee-47b1-bda4-8968aa32af8f" containerName="glance-httpd" containerID="cri-o://4e71bf1b8b96f550532c65e6aae3ca3b030455e41b574f8de2ada1855c213874" gracePeriod=30 Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.484468 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="953c9b1f-e673-499d-a6bf-9a20e8d4e69e" path="/var/lib/kubelet/pods/953c9b1f-e673-499d-a6bf-9a20e8d4e69e/volumes" Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.742634 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.857595 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tnhc\" (UniqueName: \"kubernetes.io/projected/207fd9cd-d5eb-4821-974c-5744bbbd981b-kube-api-access-9tnhc\") pod \"207fd9cd-d5eb-4821-974c-5744bbbd981b\" (UID: \"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.857655 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/207fd9cd-d5eb-4821-974c-5744bbbd981b-scripts\") pod \"207fd9cd-d5eb-4821-974c-5744bbbd981b\" (UID: \"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.857927 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/207fd9cd-d5eb-4821-974c-5744bbbd981b-config-data\") pod \"207fd9cd-d5eb-4821-974c-5744bbbd981b\" (UID: \"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.858054 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/207fd9cd-d5eb-4821-974c-5744bbbd981b-run-httpd\") pod \"207fd9cd-d5eb-4821-974c-5744bbbd981b\" (UID: 
\"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.858096 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/207fd9cd-d5eb-4821-974c-5744bbbd981b-combined-ca-bundle\") pod \"207fd9cd-d5eb-4821-974c-5744bbbd981b\" (UID: \"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.858133 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/207fd9cd-d5eb-4821-974c-5744bbbd981b-log-httpd\") pod \"207fd9cd-d5eb-4821-974c-5744bbbd981b\" (UID: \"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.858163 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/207fd9cd-d5eb-4821-974c-5744bbbd981b-sg-core-conf-yaml\") pod \"207fd9cd-d5eb-4821-974c-5744bbbd981b\" (UID: \"207fd9cd-d5eb-4821-974c-5744bbbd981b\") " Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.860459 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/207fd9cd-d5eb-4821-974c-5744bbbd981b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "207fd9cd-d5eb-4821-974c-5744bbbd981b" (UID: "207fd9cd-d5eb-4821-974c-5744bbbd981b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.862184 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/207fd9cd-d5eb-4821-974c-5744bbbd981b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "207fd9cd-d5eb-4821-974c-5744bbbd981b" (UID: "207fd9cd-d5eb-4821-974c-5744bbbd981b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.864913 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/207fd9cd-d5eb-4821-974c-5744bbbd981b-scripts" (OuterVolumeSpecName: "scripts") pod "207fd9cd-d5eb-4821-974c-5744bbbd981b" (UID: "207fd9cd-d5eb-4821-974c-5744bbbd981b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.867309 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/207fd9cd-d5eb-4821-974c-5744bbbd981b-kube-api-access-9tnhc" (OuterVolumeSpecName: "kube-api-access-9tnhc") pod "207fd9cd-d5eb-4821-974c-5744bbbd981b" (UID: "207fd9cd-d5eb-4821-974c-5744bbbd981b"). InnerVolumeSpecName "kube-api-access-9tnhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.903664 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/207fd9cd-d5eb-4821-974c-5744bbbd981b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "207fd9cd-d5eb-4821-974c-5744bbbd981b" (UID: "207fd9cd-d5eb-4821-974c-5744bbbd981b"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.960088 4869 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/207fd9cd-d5eb-4821-974c-5744bbbd981b-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.960123 4869 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/207fd9cd-d5eb-4821-974c-5744bbbd981b-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.960132 4869 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/207fd9cd-d5eb-4821-974c-5744bbbd981b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.960143 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tnhc\" (UniqueName: \"kubernetes.io/projected/207fd9cd-d5eb-4821-974c-5744bbbd981b-kube-api-access-9tnhc\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.960151 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/207fd9cd-d5eb-4821-974c-5744bbbd981b-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:43 crc kubenswrapper[4869]: I0218 06:06:43.984942 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/207fd9cd-d5eb-4821-974c-5744bbbd981b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "207fd9cd-d5eb-4821-974c-5744bbbd981b" (UID: "207fd9cd-d5eb-4821-974c-5744bbbd981b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.029308 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/207fd9cd-d5eb-4821-974c-5744bbbd981b-config-data" (OuterVolumeSpecName: "config-data") pod "207fd9cd-d5eb-4821-974c-5744bbbd981b" (UID: "207fd9cd-d5eb-4821-974c-5744bbbd981b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.052487 4869 generic.go:334] "Generic (PLEG): container finished" podID="696bf351-11ee-47b1-bda4-8968aa32af8f" containerID="3b4990a9258d9c42d96c115fe51582a869ae9a39017962bb7dc0b390043f3429" exitCode=143 Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.052610 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"696bf351-11ee-47b1-bda4-8968aa32af8f","Type":"ContainerDied","Data":"3b4990a9258d9c42d96c115fe51582a869ae9a39017962bb7dc0b390043f3429"} Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.056864 4869 generic.go:334] "Generic (PLEG): container finished" podID="207fd9cd-d5eb-4821-974c-5744bbbd981b" containerID="260f02e0990e8df6d6e821a2fdf4a3d71bc60774b01bf203f11c6671c447026b" exitCode=0 Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.056993 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"207fd9cd-d5eb-4821-974c-5744bbbd981b","Type":"ContainerDied","Data":"260f02e0990e8df6d6e821a2fdf4a3d71bc60774b01bf203f11c6671c447026b"} Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.057006 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.057175 4869 scope.go:117] "RemoveContainer" containerID="98bdfeb997c971016d2a1b1f493bdb788aceebb93e8c727b3903add04c16fca1" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.059847 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"207fd9cd-d5eb-4821-974c-5744bbbd981b","Type":"ContainerDied","Data":"4d3fa2b8e806fcf13af1d2148077782b76f1d5f0ba0470ef686edd826cfef5dc"} Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.061391 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/207fd9cd-d5eb-4821-974c-5744bbbd981b-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.061417 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/207fd9cd-d5eb-4821-974c-5744bbbd981b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.090183 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.090809 4869 scope.go:117] "RemoveContainer" containerID="788fa423653f9600123381245e5afb69ced12ef077dfd78321fbd98fb8a2c1cb" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.098864 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.110244 4869 scope.go:117] "RemoveContainer" containerID="260f02e0990e8df6d6e821a2fdf4a3d71bc60774b01bf203f11c6671c447026b" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.125555 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:06:44 crc kubenswrapper[4869]: E0218 06:06:44.126006 4869 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="207fd9cd-d5eb-4821-974c-5744bbbd981b" containerName="proxy-httpd" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.126026 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="207fd9cd-d5eb-4821-974c-5744bbbd981b" containerName="proxy-httpd" Feb 18 06:06:44 crc kubenswrapper[4869]: E0218 06:06:44.126045 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="207fd9cd-d5eb-4821-974c-5744bbbd981b" containerName="ceilometer-notification-agent" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.126052 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="207fd9cd-d5eb-4821-974c-5744bbbd981b" containerName="ceilometer-notification-agent" Feb 18 06:06:44 crc kubenswrapper[4869]: E0218 06:06:44.126067 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="207fd9cd-d5eb-4821-974c-5744bbbd981b" containerName="ceilometer-central-agent" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.126073 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="207fd9cd-d5eb-4821-974c-5744bbbd981b" containerName="ceilometer-central-agent" Feb 18 06:06:44 crc kubenswrapper[4869]: E0218 06:06:44.126080 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="953c9b1f-e673-499d-a6bf-9a20e8d4e69e" containerName="placement-api" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.126085 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="953c9b1f-e673-499d-a6bf-9a20e8d4e69e" containerName="placement-api" Feb 18 06:06:44 crc kubenswrapper[4869]: E0218 06:06:44.126104 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="207fd9cd-d5eb-4821-974c-5744bbbd981b" containerName="sg-core" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.126110 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="207fd9cd-d5eb-4821-974c-5744bbbd981b" containerName="sg-core" Feb 18 06:06:44 crc kubenswrapper[4869]: E0218 06:06:44.126118 4869 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="953c9b1f-e673-499d-a6bf-9a20e8d4e69e" containerName="placement-log" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.126123 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="953c9b1f-e673-499d-a6bf-9a20e8d4e69e" containerName="placement-log" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.126310 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="207fd9cd-d5eb-4821-974c-5744bbbd981b" containerName="ceilometer-central-agent" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.126336 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="207fd9cd-d5eb-4821-974c-5744bbbd981b" containerName="ceilometer-notification-agent" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.126348 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="953c9b1f-e673-499d-a6bf-9a20e8d4e69e" containerName="placement-log" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.126362 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="953c9b1f-e673-499d-a6bf-9a20e8d4e69e" containerName="placement-api" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.126374 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="207fd9cd-d5eb-4821-974c-5744bbbd981b" containerName="sg-core" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.126386 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="207fd9cd-d5eb-4821-974c-5744bbbd981b" containerName="proxy-httpd" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.129453 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.135869 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.136077 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.140977 4869 scope.go:117] "RemoveContainer" containerID="75379434ada0a52f949c485cf893c5f7790e9079bd586a92916294a5ffff6873" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.141289 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.164621 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkl8t\" (UniqueName: \"kubernetes.io/projected/f2d4f178-d98c-45b8-9c71-95d42a42093b-kube-api-access-jkl8t\") pod \"ceilometer-0\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") " pod="openstack/ceilometer-0" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.164698 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2d4f178-d98c-45b8-9c71-95d42a42093b-run-httpd\") pod \"ceilometer-0\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") " pod="openstack/ceilometer-0" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.164889 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2d4f178-d98c-45b8-9c71-95d42a42093b-config-data\") pod \"ceilometer-0\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") " pod="openstack/ceilometer-0" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.164929 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2d4f178-d98c-45b8-9c71-95d42a42093b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") " pod="openstack/ceilometer-0" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.164978 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2d4f178-d98c-45b8-9c71-95d42a42093b-scripts\") pod \"ceilometer-0\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") " pod="openstack/ceilometer-0" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.164999 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2d4f178-d98c-45b8-9c71-95d42a42093b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") " pod="openstack/ceilometer-0" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.165060 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2d4f178-d98c-45b8-9c71-95d42a42093b-log-httpd\") pod \"ceilometer-0\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") " pod="openstack/ceilometer-0" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.216493 4869 scope.go:117] "RemoveContainer" containerID="98bdfeb997c971016d2a1b1f493bdb788aceebb93e8c727b3903add04c16fca1" Feb 18 06:06:44 crc kubenswrapper[4869]: E0218 06:06:44.217321 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98bdfeb997c971016d2a1b1f493bdb788aceebb93e8c727b3903add04c16fca1\": container with ID starting with 98bdfeb997c971016d2a1b1f493bdb788aceebb93e8c727b3903add04c16fca1 not found: ID does not exist" containerID="98bdfeb997c971016d2a1b1f493bdb788aceebb93e8c727b3903add04c16fca1" Feb 18 06:06:44 crc 
kubenswrapper[4869]: I0218 06:06:44.217401 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98bdfeb997c971016d2a1b1f493bdb788aceebb93e8c727b3903add04c16fca1"} err="failed to get container status \"98bdfeb997c971016d2a1b1f493bdb788aceebb93e8c727b3903add04c16fca1\": rpc error: code = NotFound desc = could not find container \"98bdfeb997c971016d2a1b1f493bdb788aceebb93e8c727b3903add04c16fca1\": container with ID starting with 98bdfeb997c971016d2a1b1f493bdb788aceebb93e8c727b3903add04c16fca1 not found: ID does not exist" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.217453 4869 scope.go:117] "RemoveContainer" containerID="788fa423653f9600123381245e5afb69ced12ef077dfd78321fbd98fb8a2c1cb" Feb 18 06:06:44 crc kubenswrapper[4869]: E0218 06:06:44.218575 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"788fa423653f9600123381245e5afb69ced12ef077dfd78321fbd98fb8a2c1cb\": container with ID starting with 788fa423653f9600123381245e5afb69ced12ef077dfd78321fbd98fb8a2c1cb not found: ID does not exist" containerID="788fa423653f9600123381245e5afb69ced12ef077dfd78321fbd98fb8a2c1cb" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.218611 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"788fa423653f9600123381245e5afb69ced12ef077dfd78321fbd98fb8a2c1cb"} err="failed to get container status \"788fa423653f9600123381245e5afb69ced12ef077dfd78321fbd98fb8a2c1cb\": rpc error: code = NotFound desc = could not find container \"788fa423653f9600123381245e5afb69ced12ef077dfd78321fbd98fb8a2c1cb\": container with ID starting with 788fa423653f9600123381245e5afb69ced12ef077dfd78321fbd98fb8a2c1cb not found: ID does not exist" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.218632 4869 scope.go:117] "RemoveContainer" containerID="260f02e0990e8df6d6e821a2fdf4a3d71bc60774b01bf203f11c6671c447026b" Feb 18 
06:06:44 crc kubenswrapper[4869]: E0218 06:06:44.219371 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"260f02e0990e8df6d6e821a2fdf4a3d71bc60774b01bf203f11c6671c447026b\": container with ID starting with 260f02e0990e8df6d6e821a2fdf4a3d71bc60774b01bf203f11c6671c447026b not found: ID does not exist" containerID="260f02e0990e8df6d6e821a2fdf4a3d71bc60774b01bf203f11c6671c447026b" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.219407 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"260f02e0990e8df6d6e821a2fdf4a3d71bc60774b01bf203f11c6671c447026b"} err="failed to get container status \"260f02e0990e8df6d6e821a2fdf4a3d71bc60774b01bf203f11c6671c447026b\": rpc error: code = NotFound desc = could not find container \"260f02e0990e8df6d6e821a2fdf4a3d71bc60774b01bf203f11c6671c447026b\": container with ID starting with 260f02e0990e8df6d6e821a2fdf4a3d71bc60774b01bf203f11c6671c447026b not found: ID does not exist" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.219432 4869 scope.go:117] "RemoveContainer" containerID="75379434ada0a52f949c485cf893c5f7790e9079bd586a92916294a5ffff6873" Feb 18 06:06:44 crc kubenswrapper[4869]: E0218 06:06:44.225969 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75379434ada0a52f949c485cf893c5f7790e9079bd586a92916294a5ffff6873\": container with ID starting with 75379434ada0a52f949c485cf893c5f7790e9079bd586a92916294a5ffff6873 not found: ID does not exist" containerID="75379434ada0a52f949c485cf893c5f7790e9079bd586a92916294a5ffff6873" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.226047 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75379434ada0a52f949c485cf893c5f7790e9079bd586a92916294a5ffff6873"} err="failed to get container status 
\"75379434ada0a52f949c485cf893c5f7790e9079bd586a92916294a5ffff6873\": rpc error: code = NotFound desc = could not find container \"75379434ada0a52f949c485cf893c5f7790e9079bd586a92916294a5ffff6873\": container with ID starting with 75379434ada0a52f949c485cf893c5f7790e9079bd586a92916294a5ffff6873 not found: ID does not exist" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.271592 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkl8t\" (UniqueName: \"kubernetes.io/projected/f2d4f178-d98c-45b8-9c71-95d42a42093b-kube-api-access-jkl8t\") pod \"ceilometer-0\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") " pod="openstack/ceilometer-0" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.272247 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2d4f178-d98c-45b8-9c71-95d42a42093b-run-httpd\") pod \"ceilometer-0\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") " pod="openstack/ceilometer-0" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.272485 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2d4f178-d98c-45b8-9c71-95d42a42093b-config-data\") pod \"ceilometer-0\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") " pod="openstack/ceilometer-0" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.272646 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2d4f178-d98c-45b8-9c71-95d42a42093b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") " pod="openstack/ceilometer-0" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.272806 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2d4f178-d98c-45b8-9c71-95d42a42093b-scripts\") pod 
\"ceilometer-0\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") " pod="openstack/ceilometer-0" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.272945 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2d4f178-d98c-45b8-9c71-95d42a42093b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") " pod="openstack/ceilometer-0" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.273123 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2d4f178-d98c-45b8-9c71-95d42a42093b-log-httpd\") pod \"ceilometer-0\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") " pod="openstack/ceilometer-0" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.273603 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2d4f178-d98c-45b8-9c71-95d42a42093b-log-httpd\") pod \"ceilometer-0\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") " pod="openstack/ceilometer-0" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.273022 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2d4f178-d98c-45b8-9c71-95d42a42093b-run-httpd\") pod \"ceilometer-0\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") " pod="openstack/ceilometer-0" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.281677 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2d4f178-d98c-45b8-9c71-95d42a42093b-config-data\") pod \"ceilometer-0\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") " pod="openstack/ceilometer-0" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.281857 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f2d4f178-d98c-45b8-9c71-95d42a42093b-scripts\") pod \"ceilometer-0\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") " pod="openstack/ceilometer-0" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.282347 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2d4f178-d98c-45b8-9c71-95d42a42093b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") " pod="openstack/ceilometer-0" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.282606 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2d4f178-d98c-45b8-9c71-95d42a42093b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") " pod="openstack/ceilometer-0" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.293399 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkl8t\" (UniqueName: \"kubernetes.io/projected/f2d4f178-d98c-45b8-9c71-95d42a42093b-kube-api-access-jkl8t\") pod \"ceilometer-0\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") " pod="openstack/ceilometer-0" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.407984 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.472287 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.759524 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:06:44 crc kubenswrapper[4869]: I0218 06:06:44.768166 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-69d999cf4d-drf2r" podUID="adfe77ee-719d-4b80-ae06-8a0a370cf7d2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Feb 18 06:06:45 crc kubenswrapper[4869]: I0218 06:06:45.066108 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2d4f178-d98c-45b8-9c71-95d42a42093b","Type":"ContainerStarted","Data":"b125c2d9056277fa0a1b59122ccc7a3f4b74146f507ed3d2d164bfb278a48d3a"} Feb 18 06:06:45 crc kubenswrapper[4869]: I0218 06:06:45.480955 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="207fd9cd-d5eb-4821-974c-5744bbbd981b" path="/var/lib/kubelet/pods/207fd9cd-d5eb-4821-974c-5744bbbd981b/volumes" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.091066 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2d4f178-d98c-45b8-9c71-95d42a42093b","Type":"ContainerStarted","Data":"ce945d401ed9c1523a72d9121ea71376ab58432cb527d1d112ad2b5e527b051e"} Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.385075 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-7qzlb"] Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.386382 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-7qzlb" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.398678 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7qzlb"] Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.415430 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffaffd72-bfdb-4695-a882-14c5eb87ed33-operator-scripts\") pod \"nova-api-db-create-7qzlb\" (UID: \"ffaffd72-bfdb-4695-a882-14c5eb87ed33\") " pod="openstack/nova-api-db-create-7qzlb" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.415503 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlrnz\" (UniqueName: \"kubernetes.io/projected/ffaffd72-bfdb-4695-a882-14c5eb87ed33-kube-api-access-wlrnz\") pod \"nova-api-db-create-7qzlb\" (UID: \"ffaffd72-bfdb-4695-a882-14c5eb87ed33\") " pod="openstack/nova-api-db-create-7qzlb" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.498622 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-1cde-account-create-update-rzrgg"] Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.500480 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-1cde-account-create-update-rzrgg" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.506901 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.517196 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffaffd72-bfdb-4695-a882-14c5eb87ed33-operator-scripts\") pod \"nova-api-db-create-7qzlb\" (UID: \"ffaffd72-bfdb-4695-a882-14c5eb87ed33\") " pod="openstack/nova-api-db-create-7qzlb" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.517257 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlrnz\" (UniqueName: \"kubernetes.io/projected/ffaffd72-bfdb-4695-a882-14c5eb87ed33-kube-api-access-wlrnz\") pod \"nova-api-db-create-7qzlb\" (UID: \"ffaffd72-bfdb-4695-a882-14c5eb87ed33\") " pod="openstack/nova-api-db-create-7qzlb" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.518671 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffaffd72-bfdb-4695-a882-14c5eb87ed33-operator-scripts\") pod \"nova-api-db-create-7qzlb\" (UID: \"ffaffd72-bfdb-4695-a882-14c5eb87ed33\") " pod="openstack/nova-api-db-create-7qzlb" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.521907 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1cde-account-create-update-rzrgg"] Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.539267 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlrnz\" (UniqueName: \"kubernetes.io/projected/ffaffd72-bfdb-4695-a882-14c5eb87ed33-kube-api-access-wlrnz\") pod \"nova-api-db-create-7qzlb\" (UID: \"ffaffd72-bfdb-4695-a882-14c5eb87ed33\") " pod="openstack/nova-api-db-create-7qzlb" Feb 18 06:06:46 crc 
kubenswrapper[4869]: I0218 06:06:46.594809 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-gc7td"] Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.597046 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gc7td" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.604659 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-gc7td"] Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.619480 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57fede98-3d1b-4596-baec-4d975793c9ea-operator-scripts\") pod \"nova-api-1cde-account-create-update-rzrgg\" (UID: \"57fede98-3d1b-4596-baec-4d975793c9ea\") " pod="openstack/nova-api-1cde-account-create-update-rzrgg" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.619649 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl858\" (UniqueName: \"kubernetes.io/projected/57fede98-3d1b-4596-baec-4d975793c9ea-kube-api-access-gl858\") pod \"nova-api-1cde-account-create-update-rzrgg\" (UID: \"57fede98-3d1b-4596-baec-4d975793c9ea\") " pod="openstack/nova-api-1cde-account-create-update-rzrgg" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.619687 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/436a64f9-ee1e-41cb-9db4-b918bc5c9d71-operator-scripts\") pod \"nova-cell0-db-create-gc7td\" (UID: \"436a64f9-ee1e-41cb-9db4-b918bc5c9d71\") " pod="openstack/nova-cell0-db-create-gc7td" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.619728 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqcwj\" (UniqueName: 
\"kubernetes.io/projected/436a64f9-ee1e-41cb-9db4-b918bc5c9d71-kube-api-access-zqcwj\") pod \"nova-cell0-db-create-gc7td\" (UID: \"436a64f9-ee1e-41cb-9db4-b918bc5c9d71\") " pod="openstack/nova-cell0-db-create-gc7td" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.693086 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-qssnr"] Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.694336 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-qssnr" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.704380 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7qzlb" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.721015 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl858\" (UniqueName: \"kubernetes.io/projected/57fede98-3d1b-4596-baec-4d975793c9ea-kube-api-access-gl858\") pod \"nova-api-1cde-account-create-update-rzrgg\" (UID: \"57fede98-3d1b-4596-baec-4d975793c9ea\") " pod="openstack/nova-api-1cde-account-create-update-rzrgg" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.721056 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/436a64f9-ee1e-41cb-9db4-b918bc5c9d71-operator-scripts\") pod \"nova-cell0-db-create-gc7td\" (UID: \"436a64f9-ee1e-41cb-9db4-b918bc5c9d71\") " pod="openstack/nova-cell0-db-create-gc7td" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.721093 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqcwj\" (UniqueName: \"kubernetes.io/projected/436a64f9-ee1e-41cb-9db4-b918bc5c9d71-kube-api-access-zqcwj\") pod \"nova-cell0-db-create-gc7td\" (UID: \"436a64f9-ee1e-41cb-9db4-b918bc5c9d71\") " pod="openstack/nova-cell0-db-create-gc7td" Feb 18 06:06:46 crc kubenswrapper[4869]: 
I0218 06:06:46.721118 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp7sf\" (UniqueName: \"kubernetes.io/projected/9f987f38-7e61-4316-8935-02a029937c98-kube-api-access-lp7sf\") pod \"nova-cell1-db-create-qssnr\" (UID: \"9f987f38-7e61-4316-8935-02a029937c98\") " pod="openstack/nova-cell1-db-create-qssnr" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.721149 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f987f38-7e61-4316-8935-02a029937c98-operator-scripts\") pod \"nova-cell1-db-create-qssnr\" (UID: \"9f987f38-7e61-4316-8935-02a029937c98\") " pod="openstack/nova-cell1-db-create-qssnr" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.721203 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57fede98-3d1b-4596-baec-4d975793c9ea-operator-scripts\") pod \"nova-api-1cde-account-create-update-rzrgg\" (UID: \"57fede98-3d1b-4596-baec-4d975793c9ea\") " pod="openstack/nova-api-1cde-account-create-update-rzrgg" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.721902 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57fede98-3d1b-4596-baec-4d975793c9ea-operator-scripts\") pod \"nova-api-1cde-account-create-update-rzrgg\" (UID: \"57fede98-3d1b-4596-baec-4d975793c9ea\") " pod="openstack/nova-api-1cde-account-create-update-rzrgg" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.722609 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/436a64f9-ee1e-41cb-9db4-b918bc5c9d71-operator-scripts\") pod \"nova-cell0-db-create-gc7td\" (UID: \"436a64f9-ee1e-41cb-9db4-b918bc5c9d71\") " pod="openstack/nova-cell0-db-create-gc7td" Feb 
18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.725954 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-c893-account-create-update-c8z8g"] Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.728075 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c893-account-create-update-c8z8g" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.730541 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.757601 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-qssnr"] Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.760625 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqcwj\" (UniqueName: \"kubernetes.io/projected/436a64f9-ee1e-41cb-9db4-b918bc5c9d71-kube-api-access-zqcwj\") pod \"nova-cell0-db-create-gc7td\" (UID: \"436a64f9-ee1e-41cb-9db4-b918bc5c9d71\") " pod="openstack/nova-cell0-db-create-gc7td" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.768184 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl858\" (UniqueName: \"kubernetes.io/projected/57fede98-3d1b-4596-baec-4d975793c9ea-kube-api-access-gl858\") pod \"nova-api-1cde-account-create-update-rzrgg\" (UID: \"57fede98-3d1b-4596-baec-4d975793c9ea\") " pod="openstack/nova-api-1cde-account-create-update-rzrgg" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.771222 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c893-account-create-update-c8z8g"] Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.823266 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6-operator-scripts\") pod 
\"nova-cell0-c893-account-create-update-c8z8g\" (UID: \"31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6\") " pod="openstack/nova-cell0-c893-account-create-update-c8z8g" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.823353 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp7sf\" (UniqueName: \"kubernetes.io/projected/9f987f38-7e61-4316-8935-02a029937c98-kube-api-access-lp7sf\") pod \"nova-cell1-db-create-qssnr\" (UID: \"9f987f38-7e61-4316-8935-02a029937c98\") " pod="openstack/nova-cell1-db-create-qssnr" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.823423 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t8wh\" (UniqueName: \"kubernetes.io/projected/31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6-kube-api-access-8t8wh\") pod \"nova-cell0-c893-account-create-update-c8z8g\" (UID: \"31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6\") " pod="openstack/nova-cell0-c893-account-create-update-c8z8g" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.823459 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f987f38-7e61-4316-8935-02a029937c98-operator-scripts\") pod \"nova-cell1-db-create-qssnr\" (UID: \"9f987f38-7e61-4316-8935-02a029937c98\") " pod="openstack/nova-cell1-db-create-qssnr" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.825413 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f987f38-7e61-4316-8935-02a029937c98-operator-scripts\") pod \"nova-cell1-db-create-qssnr\" (UID: \"9f987f38-7e61-4316-8935-02a029937c98\") " pod="openstack/nova-cell1-db-create-qssnr" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.830990 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-1cde-account-create-update-rzrgg" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.842935 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp7sf\" (UniqueName: \"kubernetes.io/projected/9f987f38-7e61-4316-8935-02a029937c98-kube-api-access-lp7sf\") pod \"nova-cell1-db-create-qssnr\" (UID: \"9f987f38-7e61-4316-8935-02a029937c98\") " pod="openstack/nova-cell1-db-create-qssnr" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.902994 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-4965-account-create-update-dgkm7"] Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.904231 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4965-account-create-update-dgkm7" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.909426 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.913131 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4965-account-create-update-dgkm7"] Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.925372 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6-operator-scripts\") pod \"nova-cell0-c893-account-create-update-c8z8g\" (UID: \"31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6\") " pod="openstack/nova-cell0-c893-account-create-update-c8z8g" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.925412 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t8wh\" (UniqueName: \"kubernetes.io/projected/31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6-kube-api-access-8t8wh\") pod \"nova-cell0-c893-account-create-update-c8z8g\" (UID: \"31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6\") " 
pod="openstack/nova-cell0-c893-account-create-update-c8z8g" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.926010 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6-operator-scripts\") pod \"nova-cell0-c893-account-create-update-c8z8g\" (UID: \"31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6\") " pod="openstack/nova-cell0-c893-account-create-update-c8z8g" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.939069 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gc7td" Feb 18 06:06:46 crc kubenswrapper[4869]: I0218 06:06:46.946653 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t8wh\" (UniqueName: \"kubernetes.io/projected/31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6-kube-api-access-8t8wh\") pod \"nova-cell0-c893-account-create-update-c8z8g\" (UID: \"31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6\") " pod="openstack/nova-cell0-c893-account-create-update-c8z8g" Feb 18 06:06:47 crc kubenswrapper[4869]: I0218 06:06:47.016104 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-qssnr" Feb 18 06:06:47 crc kubenswrapper[4869]: I0218 06:06:47.027808 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23568676-efc6-4e84-b939-94d530a055c0-operator-scripts\") pod \"nova-cell1-4965-account-create-update-dgkm7\" (UID: \"23568676-efc6-4e84-b939-94d530a055c0\") " pod="openstack/nova-cell1-4965-account-create-update-dgkm7" Feb 18 06:06:47 crc kubenswrapper[4869]: I0218 06:06:47.027898 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j825\" (UniqueName: \"kubernetes.io/projected/23568676-efc6-4e84-b939-94d530a055c0-kube-api-access-8j825\") pod \"nova-cell1-4965-account-create-update-dgkm7\" (UID: \"23568676-efc6-4e84-b939-94d530a055c0\") " pod="openstack/nova-cell1-4965-account-create-update-dgkm7" Feb 18 06:06:47 crc kubenswrapper[4869]: I0218 06:06:47.105551 4869 generic.go:334] "Generic (PLEG): container finished" podID="696bf351-11ee-47b1-bda4-8968aa32af8f" containerID="4e71bf1b8b96f550532c65e6aae3ca3b030455e41b574f8de2ada1855c213874" exitCode=0 Feb 18 06:06:47 crc kubenswrapper[4869]: I0218 06:06:47.105599 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"696bf351-11ee-47b1-bda4-8968aa32af8f","Type":"ContainerDied","Data":"4e71bf1b8b96f550532c65e6aae3ca3b030455e41b574f8de2ada1855c213874"} Feb 18 06:06:47 crc kubenswrapper[4869]: I0218 06:06:47.130624 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23568676-efc6-4e84-b939-94d530a055c0-operator-scripts\") pod \"nova-cell1-4965-account-create-update-dgkm7\" (UID: \"23568676-efc6-4e84-b939-94d530a055c0\") " pod="openstack/nova-cell1-4965-account-create-update-dgkm7" Feb 18 06:06:47 crc kubenswrapper[4869]: 
I0218 06:06:47.130698 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j825\" (UniqueName: \"kubernetes.io/projected/23568676-efc6-4e84-b939-94d530a055c0-kube-api-access-8j825\") pod \"nova-cell1-4965-account-create-update-dgkm7\" (UID: \"23568676-efc6-4e84-b939-94d530a055c0\") " pod="openstack/nova-cell1-4965-account-create-update-dgkm7" Feb 18 06:06:47 crc kubenswrapper[4869]: I0218 06:06:47.131841 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23568676-efc6-4e84-b939-94d530a055c0-operator-scripts\") pod \"nova-cell1-4965-account-create-update-dgkm7\" (UID: \"23568676-efc6-4e84-b939-94d530a055c0\") " pod="openstack/nova-cell1-4965-account-create-update-dgkm7" Feb 18 06:06:47 crc kubenswrapper[4869]: I0218 06:06:47.148061 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j825\" (UniqueName: \"kubernetes.io/projected/23568676-efc6-4e84-b939-94d530a055c0-kube-api-access-8j825\") pod \"nova-cell1-4965-account-create-update-dgkm7\" (UID: \"23568676-efc6-4e84-b939-94d530a055c0\") " pod="openstack/nova-cell1-4965-account-create-update-dgkm7" Feb 18 06:06:47 crc kubenswrapper[4869]: I0218 06:06:47.226158 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c893-account-create-update-c8z8g" Feb 18 06:06:47 crc kubenswrapper[4869]: I0218 06:06:47.246381 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4965-account-create-update-dgkm7" Feb 18 06:06:48 crc kubenswrapper[4869]: I0218 06:06:48.978288 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 06:06:48 crc kubenswrapper[4869]: I0218 06:06:48.978969 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="88723829-b0c8-4bc7-92fc-63f9767ff69c" containerName="glance-log" containerID="cri-o://df96014092d7b3e13bd5aa91aa9627b3d1575583149af3492195154d5ffa5ed0" gracePeriod=30 Feb 18 06:06:48 crc kubenswrapper[4869]: I0218 06:06:48.979059 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="88723829-b0c8-4bc7-92fc-63f9767ff69c" containerName="glance-httpd" containerID="cri-o://93cb7533a72cef118a3406d816719d3208d1e87c51e78c2f5ba7c7fe41f408ff" gracePeriod=30 Feb 18 06:06:49 crc kubenswrapper[4869]: I0218 06:06:49.127559 4869 generic.go:334] "Generic (PLEG): container finished" podID="88723829-b0c8-4bc7-92fc-63f9767ff69c" containerID="df96014092d7b3e13bd5aa91aa9627b3d1575583149af3492195154d5ffa5ed0" exitCode=143 Feb 18 06:06:49 crc kubenswrapper[4869]: I0218 06:06:49.127608 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"88723829-b0c8-4bc7-92fc-63f9767ff69c","Type":"ContainerDied","Data":"df96014092d7b3e13bd5aa91aa9627b3d1575583149af3492195154d5ffa5ed0"} Feb 18 06:06:49 crc kubenswrapper[4869]: I0218 06:06:49.412136 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:06:49 crc kubenswrapper[4869]: I0218 06:06:49.677026 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 18 06:06:51 crc kubenswrapper[4869]: I0218 06:06:51.600361 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:51 crc kubenswrapper[4869]: I0218 06:06:51.606660 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-846bf8ff8c-7j4wb" Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.326605 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.359298 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696bf351-11ee-47b1-bda4-8968aa32af8f-config-data\") pod \"696bf351-11ee-47b1-bda4-8968aa32af8f\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.359363 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7jc7\" (UniqueName: \"kubernetes.io/projected/696bf351-11ee-47b1-bda4-8968aa32af8f-kube-api-access-k7jc7\") pod \"696bf351-11ee-47b1-bda4-8968aa32af8f\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.359501 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696bf351-11ee-47b1-bda4-8968aa32af8f-logs\") pod \"696bf351-11ee-47b1-bda4-8968aa32af8f\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.359550 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/696bf351-11ee-47b1-bda4-8968aa32af8f-scripts\") pod \"696bf351-11ee-47b1-bda4-8968aa32af8f\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.359622 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/696bf351-11ee-47b1-bda4-8968aa32af8f-public-tls-certs\") pod \"696bf351-11ee-47b1-bda4-8968aa32af8f\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.359646 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696bf351-11ee-47b1-bda4-8968aa32af8f-combined-ca-bundle\") pod \"696bf351-11ee-47b1-bda4-8968aa32af8f\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.359668 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/696bf351-11ee-47b1-bda4-8968aa32af8f-httpd-run\") pod \"696bf351-11ee-47b1-bda4-8968aa32af8f\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.359696 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"696bf351-11ee-47b1-bda4-8968aa32af8f\" (UID: \"696bf351-11ee-47b1-bda4-8968aa32af8f\") " Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.360328 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/696bf351-11ee-47b1-bda4-8968aa32af8f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "696bf351-11ee-47b1-bda4-8968aa32af8f" (UID: "696bf351-11ee-47b1-bda4-8968aa32af8f"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.360693 4869 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/696bf351-11ee-47b1-bda4-8968aa32af8f-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.360924 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/696bf351-11ee-47b1-bda4-8968aa32af8f-logs" (OuterVolumeSpecName: "logs") pod "696bf351-11ee-47b1-bda4-8968aa32af8f" (UID: "696bf351-11ee-47b1-bda4-8968aa32af8f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.366021 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/696bf351-11ee-47b1-bda4-8968aa32af8f-kube-api-access-k7jc7" (OuterVolumeSpecName: "kube-api-access-k7jc7") pod "696bf351-11ee-47b1-bda4-8968aa32af8f" (UID: "696bf351-11ee-47b1-bda4-8968aa32af8f"). InnerVolumeSpecName "kube-api-access-k7jc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.368067 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "696bf351-11ee-47b1-bda4-8968aa32af8f" (UID: "696bf351-11ee-47b1-bda4-8968aa32af8f"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.373618 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/696bf351-11ee-47b1-bda4-8968aa32af8f-scripts" (OuterVolumeSpecName: "scripts") pod "696bf351-11ee-47b1-bda4-8968aa32af8f" (UID: "696bf351-11ee-47b1-bda4-8968aa32af8f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.420662 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/696bf351-11ee-47b1-bda4-8968aa32af8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "696bf351-11ee-47b1-bda4-8968aa32af8f" (UID: "696bf351-11ee-47b1-bda4-8968aa32af8f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.422302 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/696bf351-11ee-47b1-bda4-8968aa32af8f-config-data" (OuterVolumeSpecName: "config-data") pod "696bf351-11ee-47b1-bda4-8968aa32af8f" (UID: "696bf351-11ee-47b1-bda4-8968aa32af8f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.424247 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/696bf351-11ee-47b1-bda4-8968aa32af8f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "696bf351-11ee-47b1-bda4-8968aa32af8f" (UID: "696bf351-11ee-47b1-bda4-8968aa32af8f"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.462421 4869 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/696bf351-11ee-47b1-bda4-8968aa32af8f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.462634 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696bf351-11ee-47b1-bda4-8968aa32af8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.462670 4869 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.462679 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696bf351-11ee-47b1-bda4-8968aa32af8f-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.462689 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7jc7\" (UniqueName: \"kubernetes.io/projected/696bf351-11ee-47b1-bda4-8968aa32af8f-kube-api-access-k7jc7\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.462699 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696bf351-11ee-47b1-bda4-8968aa32af8f-logs\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.462707 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/696bf351-11ee-47b1-bda4-8968aa32af8f-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.482094 4869 operation_generator.go:917] UnmountDevice 
succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.560708 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-gc7td"] Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.565047 4869 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:52 crc kubenswrapper[4869]: W0218 06:06:52.593593 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod436a64f9_ee1e_41cb_9db4_b918bc5c9d71.slice/crio-aebadc9c412690dd44bd385c2a10a84338eae83a97ef51f034529772093af325 WatchSource:0}: Error finding container aebadc9c412690dd44bd385c2a10a84338eae83a97ef51f034529772093af325: Status 404 returned error can't find the container with id aebadc9c412690dd44bd385c2a10a84338eae83a97ef51f034529772093af325 Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.599811 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1cde-account-create-update-rzrgg"] Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.619800 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c893-account-create-update-c8z8g"] Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.756656 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7qzlb"] Feb 18 06:06:52 crc kubenswrapper[4869]: W0218 06:06:52.802389 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffaffd72_bfdb_4695_a882_14c5eb87ed33.slice/crio-c648e7cb13524a022fd66245bf5f7606e97a04d278c286facec76cb4fd5119b8 WatchSource:0}: Error finding container c648e7cb13524a022fd66245bf5f7606e97a04d278c286facec76cb4fd5119b8: 
Status 404 returned error can't find the container with id c648e7cb13524a022fd66245bf5f7606e97a04d278c286facec76cb4fd5119b8 Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.968897 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-qssnr"] Feb 18 06:06:52 crc kubenswrapper[4869]: I0218 06:06:52.990900 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4965-account-create-update-dgkm7"] Feb 18 06:06:53 crc kubenswrapper[4869]: W0218 06:06:53.007288 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f987f38_7e61_4316_8935_02a029937c98.slice/crio-0aa3b3545590d0707d40825a3dfc32efe952e54cb04520e903d443e1ed8da934 WatchSource:0}: Error finding container 0aa3b3545590d0707d40825a3dfc32efe952e54cb04520e903d443e1ed8da934: Status 404 returned error can't find the container with id 0aa3b3545590d0707d40825a3dfc32efe952e54cb04520e903d443e1ed8da934 Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.061446 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.074896 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88723829-b0c8-4bc7-92fc-63f9767ff69c-config-data\") pod \"88723829-b0c8-4bc7-92fc-63f9767ff69c\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.074947 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88723829-b0c8-4bc7-92fc-63f9767ff69c-scripts\") pod \"88723829-b0c8-4bc7-92fc-63f9767ff69c\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.075053 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88723829-b0c8-4bc7-92fc-63f9767ff69c-logs\") pod \"88723829-b0c8-4bc7-92fc-63f9767ff69c\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.075099 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88723829-b0c8-4bc7-92fc-63f9767ff69c-internal-tls-certs\") pod \"88723829-b0c8-4bc7-92fc-63f9767ff69c\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.075167 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88723829-b0c8-4bc7-92fc-63f9767ff69c-combined-ca-bundle\") pod \"88723829-b0c8-4bc7-92fc-63f9767ff69c\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.075222 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4rms\" (UniqueName: 
\"kubernetes.io/projected/88723829-b0c8-4bc7-92fc-63f9767ff69c-kube-api-access-n4rms\") pod \"88723829-b0c8-4bc7-92fc-63f9767ff69c\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.075329 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"88723829-b0c8-4bc7-92fc-63f9767ff69c\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.075369 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/88723829-b0c8-4bc7-92fc-63f9767ff69c-httpd-run\") pod \"88723829-b0c8-4bc7-92fc-63f9767ff69c\" (UID: \"88723829-b0c8-4bc7-92fc-63f9767ff69c\") " Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.075692 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88723829-b0c8-4bc7-92fc-63f9767ff69c-logs" (OuterVolumeSpecName: "logs") pod "88723829-b0c8-4bc7-92fc-63f9767ff69c" (UID: "88723829-b0c8-4bc7-92fc-63f9767ff69c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.076062 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88723829-b0c8-4bc7-92fc-63f9767ff69c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "88723829-b0c8-4bc7-92fc-63f9767ff69c" (UID: "88723829-b0c8-4bc7-92fc-63f9767ff69c"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.076317 4869 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/88723829-b0c8-4bc7-92fc-63f9767ff69c-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.076346 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88723829-b0c8-4bc7-92fc-63f9767ff69c-logs\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.111801 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "88723829-b0c8-4bc7-92fc-63f9767ff69c" (UID: "88723829-b0c8-4bc7-92fc-63f9767ff69c"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.132608 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88723829-b0c8-4bc7-92fc-63f9767ff69c-scripts" (OuterVolumeSpecName: "scripts") pod "88723829-b0c8-4bc7-92fc-63f9767ff69c" (UID: "88723829-b0c8-4bc7-92fc-63f9767ff69c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.145912 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88723829-b0c8-4bc7-92fc-63f9767ff69c-kube-api-access-n4rms" (OuterVolumeSpecName: "kube-api-access-n4rms") pod "88723829-b0c8-4bc7-92fc-63f9767ff69c" (UID: "88723829-b0c8-4bc7-92fc-63f9767ff69c"). InnerVolumeSpecName "kube-api-access-n4rms". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.184639 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88723829-b0c8-4bc7-92fc-63f9767ff69c-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.184672 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4rms\" (UniqueName: \"kubernetes.io/projected/88723829-b0c8-4bc7-92fc-63f9767ff69c-kube-api-access-n4rms\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.184710 4869 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.201123 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"696bf351-11ee-47b1-bda4-8968aa32af8f","Type":"ContainerDied","Data":"2a91f1c5803221d9a5a58e7af6112f7172361e73d3504f3ad7f545af34a9a798"} Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.201181 4869 scope.go:117] "RemoveContainer" containerID="4e71bf1b8b96f550532c65e6aae3ca3b030455e41b574f8de2ada1855c213874" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.201495 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.207012 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gc7td" event={"ID":"436a64f9-ee1e-41cb-9db4-b918bc5c9d71","Type":"ContainerStarted","Data":"aebadc9c412690dd44bd385c2a10a84338eae83a97ef51f034529772093af325"} Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.209056 4869 generic.go:334] "Generic (PLEG): container finished" podID="88723829-b0c8-4bc7-92fc-63f9767ff69c" containerID="93cb7533a72cef118a3406d816719d3208d1e87c51e78c2f5ba7c7fe41f408ff" exitCode=0 Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.209108 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"88723829-b0c8-4bc7-92fc-63f9767ff69c","Type":"ContainerDied","Data":"93cb7533a72cef118a3406d816719d3208d1e87c51e78c2f5ba7c7fe41f408ff"} Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.209125 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"88723829-b0c8-4bc7-92fc-63f9767ff69c","Type":"ContainerDied","Data":"226eef74f20f961341d7f242d8464557c1ba70f0448aee436e751beaba7c35a5"} Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.209134 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.210190 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c893-account-create-update-c8z8g" event={"ID":"31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6","Type":"ContainerStarted","Data":"4a0897497a8c3d9c7d20d77f07567dc33e3298c0684f772037cb6b9e50436c64"} Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.213180 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2d4f178-d98c-45b8-9c71-95d42a42093b","Type":"ContainerStarted","Data":"9224fb62c87f460b5072f11eb2a62d4746660cc5f73f36387e91c9d5b6ff323a"} Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.214591 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7qzlb" event={"ID":"ffaffd72-bfdb-4695-a882-14c5eb87ed33","Type":"ContainerStarted","Data":"c648e7cb13524a022fd66245bf5f7606e97a04d278c286facec76cb4fd5119b8"} Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.215444 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qssnr" event={"ID":"9f987f38-7e61-4316-8935-02a029937c98","Type":"ContainerStarted","Data":"0aa3b3545590d0707d40825a3dfc32efe952e54cb04520e903d443e1ed8da934"} Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.216580 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"348bdc65-0fd0-4870-adfc-d0d69a51e762","Type":"ContainerStarted","Data":"f25350fb5ec431768e04d6958d92f5d989630ad2af5546739a41eba31ea4896c"} Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.218361 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4965-account-create-update-dgkm7" event={"ID":"23568676-efc6-4e84-b939-94d530a055c0","Type":"ContainerStarted","Data":"2513e17375b466e0de7598dc07b8de41e4f48c77cf8a4c14f5cd452fa5f39a94"} Feb 18 06:06:53 crc kubenswrapper[4869]: 
I0218 06:06:53.220584 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1cde-account-create-update-rzrgg" event={"ID":"57fede98-3d1b-4596-baec-4d975793c9ea","Type":"ContainerStarted","Data":"a4dd5f34c368a34d07fa3102037fb65d600e730c5236e8600538f8bad1cf8dd3"} Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.227857 4869 scope.go:117] "RemoveContainer" containerID="3b4990a9258d9c42d96c115fe51582a869ae9a39017962bb7dc0b390043f3429" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.240023 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.235810745 podStartE2EDuration="16.240002594s" podCreationTimestamp="2026-02-18 06:06:37 +0000 UTC" firstStartedPulling="2026-02-18 06:06:38.851806207 +0000 UTC m=+1096.020894439" lastFinishedPulling="2026-02-18 06:06:51.855998056 +0000 UTC m=+1109.025086288" observedRunningTime="2026-02-18 06:06:53.235448582 +0000 UTC m=+1110.404536824" watchObservedRunningTime="2026-02-18 06:06:53.240002594 +0000 UTC m=+1110.409090826" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.274670 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.274961 4869 scope.go:117] "RemoveContainer" containerID="93cb7533a72cef118a3406d816719d3208d1e87c51e78c2f5ba7c7fe41f408ff" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.288589 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.297873 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:06:53 crc kubenswrapper[4869]: E0218 06:06:53.298627 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="696bf351-11ee-47b1-bda4-8968aa32af8f" containerName="glance-log" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 
06:06:53.298647 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="696bf351-11ee-47b1-bda4-8968aa32af8f" containerName="glance-log" Feb 18 06:06:53 crc kubenswrapper[4869]: E0218 06:06:53.298719 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88723829-b0c8-4bc7-92fc-63f9767ff69c" containerName="glance-log" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.298729 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="88723829-b0c8-4bc7-92fc-63f9767ff69c" containerName="glance-log" Feb 18 06:06:53 crc kubenswrapper[4869]: E0218 06:06:53.298770 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88723829-b0c8-4bc7-92fc-63f9767ff69c" containerName="glance-httpd" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.298781 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="88723829-b0c8-4bc7-92fc-63f9767ff69c" containerName="glance-httpd" Feb 18 06:06:53 crc kubenswrapper[4869]: E0218 06:06:53.298802 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="696bf351-11ee-47b1-bda4-8968aa32af8f" containerName="glance-httpd" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.298809 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="696bf351-11ee-47b1-bda4-8968aa32af8f" containerName="glance-httpd" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.299181 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="88723829-b0c8-4bc7-92fc-63f9767ff69c" containerName="glance-httpd" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.299203 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="696bf351-11ee-47b1-bda4-8968aa32af8f" containerName="glance-httpd" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.299216 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="88723829-b0c8-4bc7-92fc-63f9767ff69c" containerName="glance-log" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.299226 4869 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="696bf351-11ee-47b1-bda4-8968aa32af8f" containerName="glance-log" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.309431 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.309551 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.310916 4869 scope.go:117] "RemoveContainer" containerID="df96014092d7b3e13bd5aa91aa9627b3d1575583149af3492195154d5ffa5ed0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.312356 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.312529 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.324090 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88723829-b0c8-4bc7-92fc-63f9767ff69c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88723829-b0c8-4bc7-92fc-63f9767ff69c" (UID: "88723829-b0c8-4bc7-92fc-63f9767ff69c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.342847 4869 scope.go:117] "RemoveContainer" containerID="93cb7533a72cef118a3406d816719d3208d1e87c51e78c2f5ba7c7fe41f408ff" Feb 18 06:06:53 crc kubenswrapper[4869]: E0218 06:06:53.346872 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93cb7533a72cef118a3406d816719d3208d1e87c51e78c2f5ba7c7fe41f408ff\": container with ID starting with 93cb7533a72cef118a3406d816719d3208d1e87c51e78c2f5ba7c7fe41f408ff not found: ID does not exist" containerID="93cb7533a72cef118a3406d816719d3208d1e87c51e78c2f5ba7c7fe41f408ff" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.346915 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93cb7533a72cef118a3406d816719d3208d1e87c51e78c2f5ba7c7fe41f408ff"} err="failed to get container status \"93cb7533a72cef118a3406d816719d3208d1e87c51e78c2f5ba7c7fe41f408ff\": rpc error: code = NotFound desc = could not find container \"93cb7533a72cef118a3406d816719d3208d1e87c51e78c2f5ba7c7fe41f408ff\": container with ID starting with 93cb7533a72cef118a3406d816719d3208d1e87c51e78c2f5ba7c7fe41f408ff not found: ID does not exist" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.346941 4869 scope.go:117] "RemoveContainer" containerID="df96014092d7b3e13bd5aa91aa9627b3d1575583149af3492195154d5ffa5ed0" Feb 18 06:06:53 crc kubenswrapper[4869]: E0218 06:06:53.351933 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df96014092d7b3e13bd5aa91aa9627b3d1575583149af3492195154d5ffa5ed0\": container with ID starting with df96014092d7b3e13bd5aa91aa9627b3d1575583149af3492195154d5ffa5ed0 not found: ID does not exist" containerID="df96014092d7b3e13bd5aa91aa9627b3d1575583149af3492195154d5ffa5ed0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.351963 
4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df96014092d7b3e13bd5aa91aa9627b3d1575583149af3492195154d5ffa5ed0"} err="failed to get container status \"df96014092d7b3e13bd5aa91aa9627b3d1575583149af3492195154d5ffa5ed0\": rpc error: code = NotFound desc = could not find container \"df96014092d7b3e13bd5aa91aa9627b3d1575583149af3492195154d5ffa5ed0\": container with ID starting with df96014092d7b3e13bd5aa91aa9627b3d1575583149af3492195154d5ffa5ed0 not found: ID does not exist" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.389818 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/469d9354-4653-4a99-b457-3b453082e0e0-config-data\") pod \"glance-default-external-api-0\" (UID: \"469d9354-4653-4a99-b457-3b453082e0e0\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.389906 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcj97\" (UniqueName: \"kubernetes.io/projected/469d9354-4653-4a99-b457-3b453082e0e0-kube-api-access-bcj97\") pod \"glance-default-external-api-0\" (UID: \"469d9354-4653-4a99-b457-3b453082e0e0\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.389953 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/469d9354-4653-4a99-b457-3b453082e0e0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"469d9354-4653-4a99-b457-3b453082e0e0\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.389980 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"469d9354-4653-4a99-b457-3b453082e0e0\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.390016 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/469d9354-4653-4a99-b457-3b453082e0e0-logs\") pod \"glance-default-external-api-0\" (UID: \"469d9354-4653-4a99-b457-3b453082e0e0\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.390036 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/469d9354-4653-4a99-b457-3b453082e0e0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"469d9354-4653-4a99-b457-3b453082e0e0\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.390063 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/469d9354-4653-4a99-b457-3b453082e0e0-scripts\") pod \"glance-default-external-api-0\" (UID: \"469d9354-4653-4a99-b457-3b453082e0e0\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.390104 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/469d9354-4653-4a99-b457-3b453082e0e0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"469d9354-4653-4a99-b457-3b453082e0e0\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.390176 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/88723829-b0c8-4bc7-92fc-63f9767ff69c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.451087 4869 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 18 06:06:53 crc kubenswrapper[4869]: E0218 06:06:53.488887 4869 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod696bf351_11ee_47b1_bda4_8968aa32af8f.slice/crio-2a91f1c5803221d9a5a58e7af6112f7172361e73d3504f3ad7f545af34a9a798\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod696bf351_11ee_47b1_bda4_8968aa32af8f.slice\": RecentStats: unable to find data in memory cache]" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.491023 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="696bf351-11ee-47b1-bda4-8968aa32af8f" path="/var/lib/kubelet/pods/696bf351-11ee-47b1-bda4-8968aa32af8f/volumes" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.492089 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/469d9354-4653-4a99-b457-3b453082e0e0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"469d9354-4653-4a99-b457-3b453082e0e0\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.492156 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/469d9354-4653-4a99-b457-3b453082e0e0-config-data\") pod \"glance-default-external-api-0\" (UID: \"469d9354-4653-4a99-b457-3b453082e0e0\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 
06:06:53.492195 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcj97\" (UniqueName: \"kubernetes.io/projected/469d9354-4653-4a99-b457-3b453082e0e0-kube-api-access-bcj97\") pod \"glance-default-external-api-0\" (UID: \"469d9354-4653-4a99-b457-3b453082e0e0\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.492228 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/469d9354-4653-4a99-b457-3b453082e0e0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"469d9354-4653-4a99-b457-3b453082e0e0\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.492846 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"469d9354-4653-4a99-b457-3b453082e0e0\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.492904 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/469d9354-4653-4a99-b457-3b453082e0e0-logs\") pod \"glance-default-external-api-0\" (UID: \"469d9354-4653-4a99-b457-3b453082e0e0\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.492925 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/469d9354-4653-4a99-b457-3b453082e0e0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"469d9354-4653-4a99-b457-3b453082e0e0\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.492952 4869 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/469d9354-4653-4a99-b457-3b453082e0e0-scripts\") pod \"glance-default-external-api-0\" (UID: \"469d9354-4653-4a99-b457-3b453082e0e0\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.493014 4869 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.493311 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"469d9354-4653-4a99-b457-3b453082e0e0\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.494390 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/469d9354-4653-4a99-b457-3b453082e0e0-logs\") pod \"glance-default-external-api-0\" (UID: \"469d9354-4653-4a99-b457-3b453082e0e0\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.494611 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/469d9354-4653-4a99-b457-3b453082e0e0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"469d9354-4653-4a99-b457-3b453082e0e0\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.499539 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/469d9354-4653-4a99-b457-3b453082e0e0-config-data\") pod \"glance-default-external-api-0\" (UID: \"469d9354-4653-4a99-b457-3b453082e0e0\") " 
pod="openstack/glance-default-external-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.509601 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcj97\" (UniqueName: \"kubernetes.io/projected/469d9354-4653-4a99-b457-3b453082e0e0-kube-api-access-bcj97\") pod \"glance-default-external-api-0\" (UID: \"469d9354-4653-4a99-b457-3b453082e0e0\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.512655 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/469d9354-4653-4a99-b457-3b453082e0e0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"469d9354-4653-4a99-b457-3b453082e0e0\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.513293 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/469d9354-4653-4a99-b457-3b453082e0e0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"469d9354-4653-4a99-b457-3b453082e0e0\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.514481 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/469d9354-4653-4a99-b457-3b453082e0e0-scripts\") pod \"glance-default-external-api-0\" (UID: \"469d9354-4653-4a99-b457-3b453082e0e0\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.639891 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88723829-b0c8-4bc7-92fc-63f9767ff69c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "88723829-b0c8-4bc7-92fc-63f9767ff69c" (UID: "88723829-b0c8-4bc7-92fc-63f9767ff69c"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.652081 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"469d9354-4653-4a99-b457-3b453082e0e0\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.666321 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88723829-b0c8-4bc7-92fc-63f9767ff69c-config-data" (OuterVolumeSpecName: "config-data") pod "88723829-b0c8-4bc7-92fc-63f9767ff69c" (UID: "88723829-b0c8-4bc7-92fc-63f9767ff69c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.696522 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88723829-b0c8-4bc7-92fc-63f9767ff69c-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.696561 4869 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88723829-b0c8-4bc7-92fc-63f9767ff69c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.850915 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.869561 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.892858 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.894402 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.904036 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.904349 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.916082 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 06:06:53 crc kubenswrapper[4869]: I0218 06:06:53.958287 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.008769 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dfb852b-63ab-46ce-8f5b-3c6be7b02400-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1dfb852b-63ab-46ce-8f5b-3c6be7b02400\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.008823 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dfb852b-63ab-46ce-8f5b-3c6be7b02400-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1dfb852b-63ab-46ce-8f5b-3c6be7b02400\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.008858 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dfb852b-63ab-46ce-8f5b-3c6be7b02400-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1dfb852b-63ab-46ce-8f5b-3c6be7b02400\") " 
pod="openstack/glance-default-internal-api-0" Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.008875 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dfb852b-63ab-46ce-8f5b-3c6be7b02400-logs\") pod \"glance-default-internal-api-0\" (UID: \"1dfb852b-63ab-46ce-8f5b-3c6be7b02400\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.008903 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"1dfb852b-63ab-46ce-8f5b-3c6be7b02400\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.008930 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dfb852b-63ab-46ce-8f5b-3c6be7b02400-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1dfb852b-63ab-46ce-8f5b-3c6be7b02400\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.008955 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt5h9\" (UniqueName: \"kubernetes.io/projected/1dfb852b-63ab-46ce-8f5b-3c6be7b02400-kube-api-access-rt5h9\") pod \"glance-default-internal-api-0\" (UID: \"1dfb852b-63ab-46ce-8f5b-3c6be7b02400\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.009003 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1dfb852b-63ab-46ce-8f5b-3c6be7b02400-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1dfb852b-63ab-46ce-8f5b-3c6be7b02400\") 
" pod="openstack/glance-default-internal-api-0" Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.111228 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt5h9\" (UniqueName: \"kubernetes.io/projected/1dfb852b-63ab-46ce-8f5b-3c6be7b02400-kube-api-access-rt5h9\") pod \"glance-default-internal-api-0\" (UID: \"1dfb852b-63ab-46ce-8f5b-3c6be7b02400\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.111555 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1dfb852b-63ab-46ce-8f5b-3c6be7b02400-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1dfb852b-63ab-46ce-8f5b-3c6be7b02400\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.111637 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dfb852b-63ab-46ce-8f5b-3c6be7b02400-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1dfb852b-63ab-46ce-8f5b-3c6be7b02400\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.111660 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dfb852b-63ab-46ce-8f5b-3c6be7b02400-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1dfb852b-63ab-46ce-8f5b-3c6be7b02400\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.111689 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dfb852b-63ab-46ce-8f5b-3c6be7b02400-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1dfb852b-63ab-46ce-8f5b-3c6be7b02400\") " pod="openstack/glance-default-internal-api-0" 
Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.111705 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dfb852b-63ab-46ce-8f5b-3c6be7b02400-logs\") pod \"glance-default-internal-api-0\" (UID: \"1dfb852b-63ab-46ce-8f5b-3c6be7b02400\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.111725 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"1dfb852b-63ab-46ce-8f5b-3c6be7b02400\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.111766 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dfb852b-63ab-46ce-8f5b-3c6be7b02400-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1dfb852b-63ab-46ce-8f5b-3c6be7b02400\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.112093 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1dfb852b-63ab-46ce-8f5b-3c6be7b02400-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1dfb852b-63ab-46ce-8f5b-3c6be7b02400\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.113117 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dfb852b-63ab-46ce-8f5b-3c6be7b02400-logs\") pod \"glance-default-internal-api-0\" (UID: \"1dfb852b-63ab-46ce-8f5b-3c6be7b02400\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.113300 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"1dfb852b-63ab-46ce-8f5b-3c6be7b02400\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.119552 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dfb852b-63ab-46ce-8f5b-3c6be7b02400-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1dfb852b-63ab-46ce-8f5b-3c6be7b02400\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.119909 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dfb852b-63ab-46ce-8f5b-3c6be7b02400-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1dfb852b-63ab-46ce-8f5b-3c6be7b02400\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.121198 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dfb852b-63ab-46ce-8f5b-3c6be7b02400-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1dfb852b-63ab-46ce-8f5b-3c6be7b02400\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.123853 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dfb852b-63ab-46ce-8f5b-3c6be7b02400-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1dfb852b-63ab-46ce-8f5b-3c6be7b02400\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.143075 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt5h9\" (UniqueName: 
\"kubernetes.io/projected/1dfb852b-63ab-46ce-8f5b-3c6be7b02400-kube-api-access-rt5h9\") pod \"glance-default-internal-api-0\" (UID: \"1dfb852b-63ab-46ce-8f5b-3c6be7b02400\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.143436 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"1dfb852b-63ab-46ce-8f5b-3c6be7b02400\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.258087 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.258154 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2d4f178-d98c-45b8-9c71-95d42a42093b","Type":"ContainerStarted","Data":"d9310aa229951f1931fd4ba1981a8f8045bddbfc715bc036dbd53df6c764ad60"} Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.264304 4869 generic.go:334] "Generic (PLEG): container finished" podID="ffaffd72-bfdb-4695-a882-14c5eb87ed33" containerID="3ae2a6be93a6c0257b9742278d7f4f254a02195f06fcb53dfe1270d0d250f04e" exitCode=0 Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.264383 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7qzlb" event={"ID":"ffaffd72-bfdb-4695-a882-14c5eb87ed33","Type":"ContainerDied","Data":"3ae2a6be93a6c0257b9742278d7f4f254a02195f06fcb53dfe1270d0d250f04e"} Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.268849 4869 generic.go:334] "Generic (PLEG): container finished" podID="436a64f9-ee1e-41cb-9db4-b918bc5c9d71" containerID="0a12f12b5ca041715d9cab710c1cac7c21e1226fa2c8c37fb622ba9b239851ca" exitCode=0 Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.268916 4869 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-cell0-db-create-gc7td" event={"ID":"436a64f9-ee1e-41cb-9db4-b918bc5c9d71","Type":"ContainerDied","Data":"0a12f12b5ca041715d9cab710c1cac7c21e1226fa2c8c37fb622ba9b239851ca"} Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.272078 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4965-account-create-update-dgkm7" event={"ID":"23568676-efc6-4e84-b939-94d530a055c0","Type":"ContainerStarted","Data":"1905007e2f0201bdb04c016b478b487ac60263424eaa13f6c572edffb2a3993c"} Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.281126 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c893-account-create-update-c8z8g" event={"ID":"31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6","Type":"ContainerStarted","Data":"c9bd66304325dffb0e3c495168d3a63ceb35429a22f3f0d86ef6228204998c99"} Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.282852 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qssnr" event={"ID":"9f987f38-7e61-4316-8935-02a029937c98","Type":"ContainerStarted","Data":"1acbf0f1a84276f5446002adcd8fb78b38cd7b713982ff7f74b233637fc33927"} Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.286417 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1cde-account-create-update-rzrgg" event={"ID":"57fede98-3d1b-4596-baec-4d975793c9ea","Type":"ContainerStarted","Data":"08d6666d7a42fc771192d0995f5f53cd020e18271debc9ec42ef79d28f489a5f"} Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.653066 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:06:54 crc kubenswrapper[4869]: W0218 06:06:54.653138 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod469d9354_4653_4a99_b457_3b453082e0e0.slice/crio-3486e1c5856055d536552cc39715d57d109c036a44646bbc5392796b1a8c4a73 WatchSource:0}: Error 
finding container 3486e1c5856055d536552cc39715d57d109c036a44646bbc5392796b1a8c4a73: Status 404 returned error can't find the container with id 3486e1c5856055d536552cc39715d57d109c036a44646bbc5392796b1a8c4a73 Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.766731 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-69d999cf4d-drf2r" podUID="adfe77ee-719d-4b80-ae06-8a0a370cf7d2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Feb 18 06:06:54 crc kubenswrapper[4869]: I0218 06:06:54.766874 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-69d999cf4d-drf2r" Feb 18 06:06:55 crc kubenswrapper[4869]: I0218 06:06:55.304872 4869 generic.go:334] "Generic (PLEG): container finished" podID="23568676-efc6-4e84-b939-94d530a055c0" containerID="1905007e2f0201bdb04c016b478b487ac60263424eaa13f6c572edffb2a3993c" exitCode=0 Feb 18 06:06:55 crc kubenswrapper[4869]: I0218 06:06:55.304976 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4965-account-create-update-dgkm7" event={"ID":"23568676-efc6-4e84-b939-94d530a055c0","Type":"ContainerDied","Data":"1905007e2f0201bdb04c016b478b487ac60263424eaa13f6c572edffb2a3993c"} Feb 18 06:06:55 crc kubenswrapper[4869]: I0218 06:06:55.311957 4869 generic.go:334] "Generic (PLEG): container finished" podID="57fede98-3d1b-4596-baec-4d975793c9ea" containerID="08d6666d7a42fc771192d0995f5f53cd020e18271debc9ec42ef79d28f489a5f" exitCode=0 Feb 18 06:06:55 crc kubenswrapper[4869]: I0218 06:06:55.312264 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1cde-account-create-update-rzrgg" event={"ID":"57fede98-3d1b-4596-baec-4d975793c9ea","Type":"ContainerDied","Data":"08d6666d7a42fc771192d0995f5f53cd020e18271debc9ec42ef79d28f489a5f"} Feb 18 06:06:55 crc kubenswrapper[4869]: I0218 06:06:55.320284 
4869 generic.go:334] "Generic (PLEG): container finished" podID="31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6" containerID="c9bd66304325dffb0e3c495168d3a63ceb35429a22f3f0d86ef6228204998c99" exitCode=0 Feb 18 06:06:55 crc kubenswrapper[4869]: I0218 06:06:55.320350 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c893-account-create-update-c8z8g" event={"ID":"31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6","Type":"ContainerDied","Data":"c9bd66304325dffb0e3c495168d3a63ceb35429a22f3f0d86ef6228204998c99"} Feb 18 06:06:55 crc kubenswrapper[4869]: I0218 06:06:55.350840 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"469d9354-4653-4a99-b457-3b453082e0e0","Type":"ContainerStarted","Data":"83db9870e586cd3e15dee704ac14617f78b819738353151af959bf22d94af85f"} Feb 18 06:06:55 crc kubenswrapper[4869]: I0218 06:06:55.350898 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"469d9354-4653-4a99-b457-3b453082e0e0","Type":"ContainerStarted","Data":"3486e1c5856055d536552cc39715d57d109c036a44646bbc5392796b1a8c4a73"} Feb 18 06:06:55 crc kubenswrapper[4869]: I0218 06:06:55.355954 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2d4f178-d98c-45b8-9c71-95d42a42093b","Type":"ContainerStarted","Data":"8d94ae41edc613878eb9622a479dd61f5f477bbd27c931482752655bbb90dbc2"} Feb 18 06:06:55 crc kubenswrapper[4869]: I0218 06:06:55.356124 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f2d4f178-d98c-45b8-9c71-95d42a42093b" containerName="ceilometer-central-agent" containerID="cri-o://ce945d401ed9c1523a72d9121ea71376ab58432cb527d1d112ad2b5e527b051e" gracePeriod=30 Feb 18 06:06:55 crc kubenswrapper[4869]: I0218 06:06:55.356360 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 18 06:06:55 crc 
kubenswrapper[4869]: I0218 06:06:55.356451 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f2d4f178-d98c-45b8-9c71-95d42a42093b" containerName="proxy-httpd" containerID="cri-o://8d94ae41edc613878eb9622a479dd61f5f477bbd27c931482752655bbb90dbc2" gracePeriod=30 Feb 18 06:06:55 crc kubenswrapper[4869]: I0218 06:06:55.356531 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f2d4f178-d98c-45b8-9c71-95d42a42093b" containerName="ceilometer-notification-agent" containerID="cri-o://9224fb62c87f460b5072f11eb2a62d4746660cc5f73f36387e91c9d5b6ff323a" gracePeriod=30 Feb 18 06:06:55 crc kubenswrapper[4869]: I0218 06:06:55.356570 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f2d4f178-d98c-45b8-9c71-95d42a42093b" containerName="sg-core" containerID="cri-o://d9310aa229951f1931fd4ba1981a8f8045bddbfc715bc036dbd53df6c764ad60" gracePeriod=30 Feb 18 06:06:55 crc kubenswrapper[4869]: I0218 06:06:55.365181 4869 generic.go:334] "Generic (PLEG): container finished" podID="9f987f38-7e61-4316-8935-02a029937c98" containerID="1acbf0f1a84276f5446002adcd8fb78b38cd7b713982ff7f74b233637fc33927" exitCode=0 Feb 18 06:06:55 crc kubenswrapper[4869]: I0218 06:06:55.366355 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qssnr" event={"ID":"9f987f38-7e61-4316-8935-02a029937c98","Type":"ContainerDied","Data":"1acbf0f1a84276f5446002adcd8fb78b38cd7b713982ff7f74b233637fc33927"} Feb 18 06:06:55 crc kubenswrapper[4869]: I0218 06:06:55.400236 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.163353492 podStartE2EDuration="11.400206211s" podCreationTimestamp="2026-02-18 06:06:44 +0000 UTC" firstStartedPulling="2026-02-18 06:06:44.770582179 +0000 UTC m=+1101.939670411" lastFinishedPulling="2026-02-18 
06:06:55.007434898 +0000 UTC m=+1112.176523130" observedRunningTime="2026-02-18 06:06:55.385116764 +0000 UTC m=+1112.554204996" watchObservedRunningTime="2026-02-18 06:06:55.400206211 +0000 UTC m=+1112.569294443" Feb 18 06:06:55 crc kubenswrapper[4869]: I0218 06:06:55.447182 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 06:06:55 crc kubenswrapper[4869]: W0218 06:06:55.464062 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1dfb852b_63ab_46ce_8f5b_3c6be7b02400.slice/crio-c9d2c4f6a29b3f3c27f60ad6ad7aab373d2d1b9f8b9c376d107d34e891d53dad WatchSource:0}: Error finding container c9d2c4f6a29b3f3c27f60ad6ad7aab373d2d1b9f8b9c376d107d34e891d53dad: Status 404 returned error can't find the container with id c9d2c4f6a29b3f3c27f60ad6ad7aab373d2d1b9f8b9c376d107d34e891d53dad Feb 18 06:06:55 crc kubenswrapper[4869]: I0218 06:06:55.492267 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88723829-b0c8-4bc7-92fc-63f9767ff69c" path="/var/lib/kubelet/pods/88723829-b0c8-4bc7-92fc-63f9767ff69c/volumes" Feb 18 06:06:55 crc kubenswrapper[4869]: I0218 06:06:55.887364 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4965-account-create-update-dgkm7" Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.038020 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23568676-efc6-4e84-b939-94d530a055c0-operator-scripts\") pod \"23568676-efc6-4e84-b939-94d530a055c0\" (UID: \"23568676-efc6-4e84-b939-94d530a055c0\") " Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.038083 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j825\" (UniqueName: \"kubernetes.io/projected/23568676-efc6-4e84-b939-94d530a055c0-kube-api-access-8j825\") pod \"23568676-efc6-4e84-b939-94d530a055c0\" (UID: \"23568676-efc6-4e84-b939-94d530a055c0\") " Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.038972 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23568676-efc6-4e84-b939-94d530a055c0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "23568676-efc6-4e84-b939-94d530a055c0" (UID: "23568676-efc6-4e84-b939-94d530a055c0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.044619 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23568676-efc6-4e84-b939-94d530a055c0-kube-api-access-8j825" (OuterVolumeSpecName: "kube-api-access-8j825") pod "23568676-efc6-4e84-b939-94d530a055c0" (UID: "23568676-efc6-4e84-b939-94d530a055c0"). InnerVolumeSpecName "kube-api-access-8j825". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.071023 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-gc7td" Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.117524 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c893-account-create-update-c8z8g" Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.126046 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1cde-account-create-update-rzrgg" Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.141087 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-qssnr" Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.143427 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23568676-efc6-4e84-b939-94d530a055c0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.143451 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j825\" (UniqueName: \"kubernetes.io/projected/23568676-efc6-4e84-b939-94d530a055c0-kube-api-access-8j825\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.169105 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-7qzlb" Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.244685 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f987f38-7e61-4316-8935-02a029937c98-operator-scripts\") pod \"9f987f38-7e61-4316-8935-02a029937c98\" (UID: \"9f987f38-7e61-4316-8935-02a029937c98\") " Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.244734 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/436a64f9-ee1e-41cb-9db4-b918bc5c9d71-operator-scripts\") pod \"436a64f9-ee1e-41cb-9db4-b918bc5c9d71\" (UID: \"436a64f9-ee1e-41cb-9db4-b918bc5c9d71\") " Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.244822 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp7sf\" (UniqueName: \"kubernetes.io/projected/9f987f38-7e61-4316-8935-02a029937c98-kube-api-access-lp7sf\") pod \"9f987f38-7e61-4316-8935-02a029937c98\" (UID: \"9f987f38-7e61-4316-8935-02a029937c98\") " Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.244948 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57fede98-3d1b-4596-baec-4d975793c9ea-operator-scripts\") pod \"57fede98-3d1b-4596-baec-4d975793c9ea\" (UID: \"57fede98-3d1b-4596-baec-4d975793c9ea\") " Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.244980 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqcwj\" (UniqueName: \"kubernetes.io/projected/436a64f9-ee1e-41cb-9db4-b918bc5c9d71-kube-api-access-zqcwj\") pod \"436a64f9-ee1e-41cb-9db4-b918bc5c9d71\" (UID: \"436a64f9-ee1e-41cb-9db4-b918bc5c9d71\") " Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.245008 4869 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6-operator-scripts\") pod \"31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6\" (UID: \"31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6\") " Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.245050 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t8wh\" (UniqueName: \"kubernetes.io/projected/31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6-kube-api-access-8t8wh\") pod \"31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6\" (UID: \"31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6\") " Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.245070 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl858\" (UniqueName: \"kubernetes.io/projected/57fede98-3d1b-4596-baec-4d975793c9ea-kube-api-access-gl858\") pod \"57fede98-3d1b-4596-baec-4d975793c9ea\" (UID: \"57fede98-3d1b-4596-baec-4d975793c9ea\") " Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.245553 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/436a64f9-ee1e-41cb-9db4-b918bc5c9d71-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "436a64f9-ee1e-41cb-9db4-b918bc5c9d71" (UID: "436a64f9-ee1e-41cb-9db4-b918bc5c9d71"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.245933 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f987f38-7e61-4316-8935-02a029937c98-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9f987f38-7e61-4316-8935-02a029937c98" (UID: "9f987f38-7e61-4316-8935-02a029937c98"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.246060 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6" (UID: "31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.246652 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57fede98-3d1b-4596-baec-4d975793c9ea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "57fede98-3d1b-4596-baec-4d975793c9ea" (UID: "57fede98-3d1b-4596-baec-4d975793c9ea"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.247553 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57fede98-3d1b-4596-baec-4d975793c9ea-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.248242 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.248336 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f987f38-7e61-4316-8935-02a029937c98-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.248429 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6-kube-api-access-8t8wh" 
(OuterVolumeSpecName: "kube-api-access-8t8wh") pod "31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6" (UID: "31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6"). InnerVolumeSpecName "kube-api-access-8t8wh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.248475 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f987f38-7e61-4316-8935-02a029937c98-kube-api-access-lp7sf" (OuterVolumeSpecName: "kube-api-access-lp7sf") pod "9f987f38-7e61-4316-8935-02a029937c98" (UID: "9f987f38-7e61-4316-8935-02a029937c98"). InnerVolumeSpecName "kube-api-access-lp7sf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.249285 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/436a64f9-ee1e-41cb-9db4-b918bc5c9d71-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.249547 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/436a64f9-ee1e-41cb-9db4-b918bc5c9d71-kube-api-access-zqcwj" (OuterVolumeSpecName: "kube-api-access-zqcwj") pod "436a64f9-ee1e-41cb-9db4-b918bc5c9d71" (UID: "436a64f9-ee1e-41cb-9db4-b918bc5c9d71"). InnerVolumeSpecName "kube-api-access-zqcwj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.251908 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57fede98-3d1b-4596-baec-4d975793c9ea-kube-api-access-gl858" (OuterVolumeSpecName: "kube-api-access-gl858") pod "57fede98-3d1b-4596-baec-4d975793c9ea" (UID: "57fede98-3d1b-4596-baec-4d975793c9ea"). InnerVolumeSpecName "kube-api-access-gl858". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.352415 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffaffd72-bfdb-4695-a882-14c5eb87ed33-operator-scripts\") pod \"ffaffd72-bfdb-4695-a882-14c5eb87ed33\" (UID: \"ffaffd72-bfdb-4695-a882-14c5eb87ed33\") "
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.352474 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlrnz\" (UniqueName: \"kubernetes.io/projected/ffaffd72-bfdb-4695-a882-14c5eb87ed33-kube-api-access-wlrnz\") pod \"ffaffd72-bfdb-4695-a882-14c5eb87ed33\" (UID: \"ffaffd72-bfdb-4695-a882-14c5eb87ed33\") "
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.353108 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp7sf\" (UniqueName: \"kubernetes.io/projected/9f987f38-7e61-4316-8935-02a029937c98-kube-api-access-lp7sf\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.353124 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqcwj\" (UniqueName: \"kubernetes.io/projected/436a64f9-ee1e-41cb-9db4-b918bc5c9d71-kube-api-access-zqcwj\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.353136 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8t8wh\" (UniqueName: \"kubernetes.io/projected/31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6-kube-api-access-8t8wh\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.353145 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl858\" (UniqueName: \"kubernetes.io/projected/57fede98-3d1b-4596-baec-4d975793c9ea-kube-api-access-gl858\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.353576 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffaffd72-bfdb-4695-a882-14c5eb87ed33-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ffaffd72-bfdb-4695-a882-14c5eb87ed33" (UID: "ffaffd72-bfdb-4695-a882-14c5eb87ed33"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.363180 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffaffd72-bfdb-4695-a882-14c5eb87ed33-kube-api-access-wlrnz" (OuterVolumeSpecName: "kube-api-access-wlrnz") pod "ffaffd72-bfdb-4695-a882-14c5eb87ed33" (UID: "ffaffd72-bfdb-4695-a882-14c5eb87ed33"). InnerVolumeSpecName "kube-api-access-wlrnz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.377101 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1cde-account-create-update-rzrgg" event={"ID":"57fede98-3d1b-4596-baec-4d975793c9ea","Type":"ContainerDied","Data":"a4dd5f34c368a34d07fa3102037fb65d600e730c5236e8600538f8bad1cf8dd3"}
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.377154 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4dd5f34c368a34d07fa3102037fb65d600e730c5236e8600538f8bad1cf8dd3"
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.377227 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1cde-account-create-update-rzrgg"
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.389123 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c893-account-create-update-c8z8g" event={"ID":"31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6","Type":"ContainerDied","Data":"4a0897497a8c3d9c7d20d77f07567dc33e3298c0684f772037cb6b9e50436c64"}
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.389172 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a0897497a8c3d9c7d20d77f07567dc33e3298c0684f772037cb6b9e50436c64"
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.389248 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c893-account-create-update-c8z8g"
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.392030 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1dfb852b-63ab-46ce-8f5b-3c6be7b02400","Type":"ContainerStarted","Data":"c9d2c4f6a29b3f3c27f60ad6ad7aab373d2d1b9f8b9c376d107d34e891d53dad"}
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.399382 4869 generic.go:334] "Generic (PLEG): container finished" podID="f2d4f178-d98c-45b8-9c71-95d42a42093b" containerID="d9310aa229951f1931fd4ba1981a8f8045bddbfc715bc036dbd53df6c764ad60" exitCode=2
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.399708 4869 generic.go:334] "Generic (PLEG): container finished" podID="f2d4f178-d98c-45b8-9c71-95d42a42093b" containerID="9224fb62c87f460b5072f11eb2a62d4746660cc5f73f36387e91c9d5b6ff323a" exitCode=0
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.399722 4869 generic.go:334] "Generic (PLEG): container finished" podID="f2d4f178-d98c-45b8-9c71-95d42a42093b" containerID="ce945d401ed9c1523a72d9121ea71376ab58432cb527d1d112ad2b5e527b051e" exitCode=0
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.399805 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2d4f178-d98c-45b8-9c71-95d42a42093b","Type":"ContainerDied","Data":"d9310aa229951f1931fd4ba1981a8f8045bddbfc715bc036dbd53df6c764ad60"}
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.399835 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2d4f178-d98c-45b8-9c71-95d42a42093b","Type":"ContainerDied","Data":"9224fb62c87f460b5072f11eb2a62d4746660cc5f73f36387e91c9d5b6ff323a"}
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.399850 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2d4f178-d98c-45b8-9c71-95d42a42093b","Type":"ContainerDied","Data":"ce945d401ed9c1523a72d9121ea71376ab58432cb527d1d112ad2b5e527b051e"}
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.405272 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7qzlb" event={"ID":"ffaffd72-bfdb-4695-a882-14c5eb87ed33","Type":"ContainerDied","Data":"c648e7cb13524a022fd66245bf5f7606e97a04d278c286facec76cb4fd5119b8"}
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.405307 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c648e7cb13524a022fd66245bf5f7606e97a04d278c286facec76cb4fd5119b8"
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.405461 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7qzlb"
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.414929 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qssnr" event={"ID":"9f987f38-7e61-4316-8935-02a029937c98","Type":"ContainerDied","Data":"0aa3b3545590d0707d40825a3dfc32efe952e54cb04520e903d443e1ed8da934"}
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.414974 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0aa3b3545590d0707d40825a3dfc32efe952e54cb04520e903d443e1ed8da934"
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.414947 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-qssnr"
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.418642 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gc7td" event={"ID":"436a64f9-ee1e-41cb-9db4-b918bc5c9d71","Type":"ContainerDied","Data":"aebadc9c412690dd44bd385c2a10a84338eae83a97ef51f034529772093af325"}
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.418679 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aebadc9c412690dd44bd385c2a10a84338eae83a97ef51f034529772093af325"
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.418765 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gc7td"
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.423302 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4965-account-create-update-dgkm7" event={"ID":"23568676-efc6-4e84-b939-94d530a055c0","Type":"ContainerDied","Data":"2513e17375b466e0de7598dc07b8de41e4f48c77cf8a4c14f5cd452fa5f39a94"}
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.423341 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2513e17375b466e0de7598dc07b8de41e4f48c77cf8a4c14f5cd452fa5f39a94"
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.423392 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4965-account-create-update-dgkm7"
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.449419 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6d4594595c-zvnfb"
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.455402 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffaffd72-bfdb-4695-a882-14c5eb87ed33-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.455445 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlrnz\" (UniqueName: \"kubernetes.io/projected/ffaffd72-bfdb-4695-a882-14c5eb87ed33-kube-api-access-wlrnz\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.562354 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5b5d66448d-sbn85"]
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.562909 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5b5d66448d-sbn85" podUID="c0473766-4d01-4ddf-b722-e8a2a9c6eb4f" containerName="neutron-api" containerID="cri-o://46dcdbc8f8037700dc00414a2271dd253517c4a88d6d81555bd4952023c86483" gracePeriod=30
Feb 18 06:06:56 crc kubenswrapper[4869]: I0218 06:06:56.563509 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5b5d66448d-sbn85" podUID="c0473766-4d01-4ddf-b722-e8a2a9c6eb4f" containerName="neutron-httpd" containerID="cri-o://b5c7bf1250d0cc530ae366d625afed89128fc7c811ec074c78f9e7e23e8306fd" gracePeriod=30
Feb 18 06:06:57 crc kubenswrapper[4869]: I0218 06:06:57.434478 4869 generic.go:334] "Generic (PLEG): container finished" podID="c0473766-4d01-4ddf-b722-e8a2a9c6eb4f" containerID="b5c7bf1250d0cc530ae366d625afed89128fc7c811ec074c78f9e7e23e8306fd" exitCode=0
Feb 18 06:06:57 crc kubenswrapper[4869]: I0218 06:06:57.434567 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b5d66448d-sbn85" event={"ID":"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f","Type":"ContainerDied","Data":"b5c7bf1250d0cc530ae366d625afed89128fc7c811ec074c78f9e7e23e8306fd"}
Feb 18 06:06:57 crc kubenswrapper[4869]: I0218 06:06:57.438392 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"469d9354-4653-4a99-b457-3b453082e0e0","Type":"ContainerStarted","Data":"3d3aabbb7acfaa4df87b50afad370168618e2acb2ea0a59c313ef4688be3abf4"}
Feb 18 06:06:57 crc kubenswrapper[4869]: I0218 06:06:57.440919 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1dfb852b-63ab-46ce-8f5b-3c6be7b02400","Type":"ContainerStarted","Data":"1d34b9a351c72168c54b2f613eed283d10b79afb04d985e817a17dd07454fb32"}
Feb 18 06:06:57 crc kubenswrapper[4869]: I0218 06:06:57.440983 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1dfb852b-63ab-46ce-8f5b-3c6be7b02400","Type":"ContainerStarted","Data":"8f3613ffd126d8a1d537ebf4980a30268e480531bcd62e39a41012e01d620b98"}
Feb 18 06:06:57 crc kubenswrapper[4869]: I0218 06:06:57.482230 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.482209909 podStartE2EDuration="4.482209909s" podCreationTimestamp="2026-02-18 06:06:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:57.475075554 +0000 UTC m=+1114.644163786" watchObservedRunningTime="2026-02-18 06:06:57.482209909 +0000 UTC m=+1114.651298131"
Feb 18 06:06:57 crc kubenswrapper[4869]: I0218 06:06:57.512840 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.512818812 podStartE2EDuration="4.512818812s" podCreationTimestamp="2026-02-18 06:06:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:57.504831208 +0000 UTC m=+1114.673919440" watchObservedRunningTime="2026-02-18 06:06:57.512818812 +0000 UTC m=+1114.681907044"
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.451025 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-69d999cf4d-drf2r"
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.485624 4869 generic.go:334] "Generic (PLEG): container finished" podID="adfe77ee-719d-4b80-ae06-8a0a370cf7d2" containerID="fbe2d81ec4b4d1900d8d3f98fb8f0834cb3dfa672b3fcf5bb915d1b5dc3fc07b" exitCode=137
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.485703 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-69d999cf4d-drf2r"
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.487977 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69d999cf4d-drf2r" event={"ID":"adfe77ee-719d-4b80-ae06-8a0a370cf7d2","Type":"ContainerDied","Data":"fbe2d81ec4b4d1900d8d3f98fb8f0834cb3dfa672b3fcf5bb915d1b5dc3fc07b"}
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.488026 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69d999cf4d-drf2r" event={"ID":"adfe77ee-719d-4b80-ae06-8a0a370cf7d2","Type":"ContainerDied","Data":"cc1e9d5ba551f30508b9c6b39a2577638a716d09b551509471801f38783ba271"}
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.488066 4869 scope.go:117] "RemoveContainer" containerID="6db09d6ae8339e476ae31de77876ac066531920f74a0a3430c02d01110a5150c"
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.568378 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-logs\") pod \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\" (UID: \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") "
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.568476 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-horizon-secret-key\") pod \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\" (UID: \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") "
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.568497 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-horizon-tls-certs\") pod \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\" (UID: \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") "
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.568554 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rslv8\" (UniqueName: \"kubernetes.io/projected/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-kube-api-access-rslv8\") pod \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\" (UID: \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") "
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.568710 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-combined-ca-bundle\") pod \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\" (UID: \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") "
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.568818 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-scripts\") pod \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\" (UID: \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") "
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.568908 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-config-data\") pod \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\" (UID: \"adfe77ee-719d-4b80-ae06-8a0a370cf7d2\") "
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.569084 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-logs" (OuterVolumeSpecName: "logs") pod "adfe77ee-719d-4b80-ae06-8a0a370cf7d2" (UID: "adfe77ee-719d-4b80-ae06-8a0a370cf7d2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.569505 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-logs\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.584999 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-kube-api-access-rslv8" (OuterVolumeSpecName: "kube-api-access-rslv8") pod "adfe77ee-719d-4b80-ae06-8a0a370cf7d2" (UID: "adfe77ee-719d-4b80-ae06-8a0a370cf7d2"). InnerVolumeSpecName "kube-api-access-rslv8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.584942 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "adfe77ee-719d-4b80-ae06-8a0a370cf7d2" (UID: "adfe77ee-719d-4b80-ae06-8a0a370cf7d2"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.598804 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-config-data" (OuterVolumeSpecName: "config-data") pod "adfe77ee-719d-4b80-ae06-8a0a370cf7d2" (UID: "adfe77ee-719d-4b80-ae06-8a0a370cf7d2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.602982 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "adfe77ee-719d-4b80-ae06-8a0a370cf7d2" (UID: "adfe77ee-719d-4b80-ae06-8a0a370cf7d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.603549 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-scripts" (OuterVolumeSpecName: "scripts") pod "adfe77ee-719d-4b80-ae06-8a0a370cf7d2" (UID: "adfe77ee-719d-4b80-ae06-8a0a370cf7d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.657721 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "adfe77ee-719d-4b80-ae06-8a0a370cf7d2" (UID: "adfe77ee-719d-4b80-ae06-8a0a370cf7d2"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.671040 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.671082 4869 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.671094 4869 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.671104 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rslv8\" (UniqueName: \"kubernetes.io/projected/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-kube-api-access-rslv8\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.671113 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.671123 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/adfe77ee-719d-4b80-ae06-8a0a370cf7d2-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.695946 4869 scope.go:117] "RemoveContainer" containerID="fbe2d81ec4b4d1900d8d3f98fb8f0834cb3dfa672b3fcf5bb915d1b5dc3fc07b"
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.721827 4869 scope.go:117] "RemoveContainer" containerID="6db09d6ae8339e476ae31de77876ac066531920f74a0a3430c02d01110a5150c"
Feb 18 06:07:01 crc kubenswrapper[4869]: E0218 06:07:01.722178 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6db09d6ae8339e476ae31de77876ac066531920f74a0a3430c02d01110a5150c\": container with ID starting with 6db09d6ae8339e476ae31de77876ac066531920f74a0a3430c02d01110a5150c not found: ID does not exist" containerID="6db09d6ae8339e476ae31de77876ac066531920f74a0a3430c02d01110a5150c"
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.722210 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6db09d6ae8339e476ae31de77876ac066531920f74a0a3430c02d01110a5150c"} err="failed to get container status \"6db09d6ae8339e476ae31de77876ac066531920f74a0a3430c02d01110a5150c\": rpc error: code = NotFound desc = could not find container \"6db09d6ae8339e476ae31de77876ac066531920f74a0a3430c02d01110a5150c\": container with ID starting with 6db09d6ae8339e476ae31de77876ac066531920f74a0a3430c02d01110a5150c not found: ID does not exist"
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.722231 4869 scope.go:117] "RemoveContainer" containerID="fbe2d81ec4b4d1900d8d3f98fb8f0834cb3dfa672b3fcf5bb915d1b5dc3fc07b"
Feb 18 06:07:01 crc kubenswrapper[4869]: E0218 06:07:01.722877 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbe2d81ec4b4d1900d8d3f98fb8f0834cb3dfa672b3fcf5bb915d1b5dc3fc07b\": container with ID starting with fbe2d81ec4b4d1900d8d3f98fb8f0834cb3dfa672b3fcf5bb915d1b5dc3fc07b not found: ID does not exist" containerID="fbe2d81ec4b4d1900d8d3f98fb8f0834cb3dfa672b3fcf5bb915d1b5dc3fc07b"
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.722905 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbe2d81ec4b4d1900d8d3f98fb8f0834cb3dfa672b3fcf5bb915d1b5dc3fc07b"} err="failed to get container status \"fbe2d81ec4b4d1900d8d3f98fb8f0834cb3dfa672b3fcf5bb915d1b5dc3fc07b\": rpc error: code = NotFound desc = could not find container \"fbe2d81ec4b4d1900d8d3f98fb8f0834cb3dfa672b3fcf5bb915d1b5dc3fc07b\": container with ID starting with fbe2d81ec4b4d1900d8d3f98fb8f0834cb3dfa672b3fcf5bb915d1b5dc3fc07b not found: ID does not exist"
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.867151 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-69d999cf4d-drf2r"]
Feb 18 06:07:01 crc kubenswrapper[4869]: I0218 06:07:01.878888 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-69d999cf4d-drf2r"]
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.003969 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-77gnw"]
Feb 18 06:07:02 crc kubenswrapper[4869]: E0218 06:07:02.004536 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="436a64f9-ee1e-41cb-9db4-b918bc5c9d71" containerName="mariadb-database-create"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.004631 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="436a64f9-ee1e-41cb-9db4-b918bc5c9d71" containerName="mariadb-database-create"
Feb 18 06:07:02 crc kubenswrapper[4869]: E0218 06:07:02.004686 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6" containerName="mariadb-account-create-update"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.004759 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6" containerName="mariadb-account-create-update"
Feb 18 06:07:02 crc kubenswrapper[4869]: E0218 06:07:02.004826 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adfe77ee-719d-4b80-ae06-8a0a370cf7d2" containerName="horizon"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.004877 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="adfe77ee-719d-4b80-ae06-8a0a370cf7d2" containerName="horizon"
Feb 18 06:07:02 crc kubenswrapper[4869]: E0218 06:07:02.004931 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffaffd72-bfdb-4695-a882-14c5eb87ed33" containerName="mariadb-database-create"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.004987 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffaffd72-bfdb-4695-a882-14c5eb87ed33" containerName="mariadb-database-create"
Feb 18 06:07:02 crc kubenswrapper[4869]: E0218 06:07:02.005045 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57fede98-3d1b-4596-baec-4d975793c9ea" containerName="mariadb-account-create-update"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.005093 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="57fede98-3d1b-4596-baec-4d975793c9ea" containerName="mariadb-account-create-update"
Feb 18 06:07:02 crc kubenswrapper[4869]: E0218 06:07:02.005158 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f987f38-7e61-4316-8935-02a029937c98" containerName="mariadb-database-create"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.005209 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f987f38-7e61-4316-8935-02a029937c98" containerName="mariadb-database-create"
Feb 18 06:07:02 crc kubenswrapper[4869]: E0218 06:07:02.005298 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23568676-efc6-4e84-b939-94d530a055c0" containerName="mariadb-account-create-update"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.005367 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="23568676-efc6-4e84-b939-94d530a055c0" containerName="mariadb-account-create-update"
Feb 18 06:07:02 crc kubenswrapper[4869]: E0218 06:07:02.005425 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adfe77ee-719d-4b80-ae06-8a0a370cf7d2" containerName="horizon-log"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.005475 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="adfe77ee-719d-4b80-ae06-8a0a370cf7d2" containerName="horizon-log"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.005697 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="23568676-efc6-4e84-b939-94d530a055c0" containerName="mariadb-account-create-update"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.005791 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="adfe77ee-719d-4b80-ae06-8a0a370cf7d2" containerName="horizon-log"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.005850 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="436a64f9-ee1e-41cb-9db4-b918bc5c9d71" containerName="mariadb-database-create"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.005909 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="adfe77ee-719d-4b80-ae06-8a0a370cf7d2" containerName="horizon"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.005973 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffaffd72-bfdb-4695-a882-14c5eb87ed33" containerName="mariadb-database-create"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.006028 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="57fede98-3d1b-4596-baec-4d975793c9ea" containerName="mariadb-account-create-update"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.006098 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f987f38-7e61-4316-8935-02a029937c98" containerName="mariadb-database-create"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.006179 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6" containerName="mariadb-account-create-update"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.007096 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-77gnw"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.012337 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.012587 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.012584 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-mjh4x"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.019863 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-77gnw"]
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.077106 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4c122b7-735c-4d51-96f4-7e7db134dece-scripts\") pod \"nova-cell0-conductor-db-sync-77gnw\" (UID: \"e4c122b7-735c-4d51-96f4-7e7db134dece\") " pod="openstack/nova-cell0-conductor-db-sync-77gnw"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.077157 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvf56\" (UniqueName: \"kubernetes.io/projected/e4c122b7-735c-4d51-96f4-7e7db134dece-kube-api-access-bvf56\") pod \"nova-cell0-conductor-db-sync-77gnw\" (UID: \"e4c122b7-735c-4d51-96f4-7e7db134dece\") " pod="openstack/nova-cell0-conductor-db-sync-77gnw"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.077249 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4c122b7-735c-4d51-96f4-7e7db134dece-config-data\") pod \"nova-cell0-conductor-db-sync-77gnw\" (UID: \"e4c122b7-735c-4d51-96f4-7e7db134dece\") " pod="openstack/nova-cell0-conductor-db-sync-77gnw"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.077300 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c122b7-735c-4d51-96f4-7e7db134dece-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-77gnw\" (UID: \"e4c122b7-735c-4d51-96f4-7e7db134dece\") " pod="openstack/nova-cell0-conductor-db-sync-77gnw"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.221287 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4c122b7-735c-4d51-96f4-7e7db134dece-scripts\") pod \"nova-cell0-conductor-db-sync-77gnw\" (UID: \"e4c122b7-735c-4d51-96f4-7e7db134dece\") " pod="openstack/nova-cell0-conductor-db-sync-77gnw"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.221341 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvf56\" (UniqueName: \"kubernetes.io/projected/e4c122b7-735c-4d51-96f4-7e7db134dece-kube-api-access-bvf56\") pod \"nova-cell0-conductor-db-sync-77gnw\" (UID: \"e4c122b7-735c-4d51-96f4-7e7db134dece\") " pod="openstack/nova-cell0-conductor-db-sync-77gnw"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.221428 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4c122b7-735c-4d51-96f4-7e7db134dece-config-data\") pod \"nova-cell0-conductor-db-sync-77gnw\" (UID: \"e4c122b7-735c-4d51-96f4-7e7db134dece\") " pod="openstack/nova-cell0-conductor-db-sync-77gnw"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.221488 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c122b7-735c-4d51-96f4-7e7db134dece-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-77gnw\" (UID: \"e4c122b7-735c-4d51-96f4-7e7db134dece\") " pod="openstack/nova-cell0-conductor-db-sync-77gnw"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.227516 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4c122b7-735c-4d51-96f4-7e7db134dece-config-data\") pod \"nova-cell0-conductor-db-sync-77gnw\" (UID: \"e4c122b7-735c-4d51-96f4-7e7db134dece\") " pod="openstack/nova-cell0-conductor-db-sync-77gnw"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.238182 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4c122b7-735c-4d51-96f4-7e7db134dece-scripts\") pod \"nova-cell0-conductor-db-sync-77gnw\" (UID: \"e4c122b7-735c-4d51-96f4-7e7db134dece\") " pod="openstack/nova-cell0-conductor-db-sync-77gnw"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.245357 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvf56\" (UniqueName: \"kubernetes.io/projected/e4c122b7-735c-4d51-96f4-7e7db134dece-kube-api-access-bvf56\") pod \"nova-cell0-conductor-db-sync-77gnw\" (UID: \"e4c122b7-735c-4d51-96f4-7e7db134dece\") " pod="openstack/nova-cell0-conductor-db-sync-77gnw"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.245563 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c122b7-735c-4d51-96f4-7e7db134dece-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-77gnw\" (UID: \"e4c122b7-735c-4d51-96f4-7e7db134dece\") " pod="openstack/nova-cell0-conductor-db-sync-77gnw"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.323205 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-77gnw"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.329458 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b5d66448d-sbn85"
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.424639 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-httpd-config\") pod \"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f\" (UID: \"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f\") "
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.424849 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-combined-ca-bundle\") pod \"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f\" (UID: \"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f\") "
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.424875 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gj7m6\" (UniqueName: \"kubernetes.io/projected/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-kube-api-access-gj7m6\") pod \"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f\" (UID: \"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f\") "
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.424932 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-ovndb-tls-certs\") pod \"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f\" (UID: \"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f\") "
Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.425487 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-config\") pod \"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f\" (UID:
\"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f\") " Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.431712 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-kube-api-access-gj7m6" (OuterVolumeSpecName: "kube-api-access-gj7m6") pod "c0473766-4d01-4ddf-b722-e8a2a9c6eb4f" (UID: "c0473766-4d01-4ddf-b722-e8a2a9c6eb4f"). InnerVolumeSpecName "kube-api-access-gj7m6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.436712 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "c0473766-4d01-4ddf-b722-e8a2a9c6eb4f" (UID: "c0473766-4d01-4ddf-b722-e8a2a9c6eb4f"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.496105 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0473766-4d01-4ddf-b722-e8a2a9c6eb4f" (UID: "c0473766-4d01-4ddf-b722-e8a2a9c6eb4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.501041 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-config" (OuterVolumeSpecName: "config") pod "c0473766-4d01-4ddf-b722-e8a2a9c6eb4f" (UID: "c0473766-4d01-4ddf-b722-e8a2a9c6eb4f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.517039 4869 generic.go:334] "Generic (PLEG): container finished" podID="c0473766-4d01-4ddf-b722-e8a2a9c6eb4f" containerID="46dcdbc8f8037700dc00414a2271dd253517c4a88d6d81555bd4952023c86483" exitCode=0 Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.517085 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b5d66448d-sbn85" event={"ID":"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f","Type":"ContainerDied","Data":"46dcdbc8f8037700dc00414a2271dd253517c4a88d6d81555bd4952023c86483"} Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.517117 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b5d66448d-sbn85" event={"ID":"c0473766-4d01-4ddf-b722-e8a2a9c6eb4f","Type":"ContainerDied","Data":"c66984bbffcb6467dfd99cd0bf446e5694703e67cc795e3464dde67788fd6d63"} Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.517135 4869 scope.go:117] "RemoveContainer" containerID="b5c7bf1250d0cc530ae366d625afed89128fc7c811ec074c78f9e7e23e8306fd" Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.517223 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5b5d66448d-sbn85" Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.528208 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.528527 4869 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.528539 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.528548 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gj7m6\" (UniqueName: \"kubernetes.io/projected/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-kube-api-access-gj7m6\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.544328 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "c0473766-4d01-4ddf-b722-e8a2a9c6eb4f" (UID: "c0473766-4d01-4ddf-b722-e8a2a9c6eb4f"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.564336 4869 scope.go:117] "RemoveContainer" containerID="46dcdbc8f8037700dc00414a2271dd253517c4a88d6d81555bd4952023c86483" Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.630832 4869 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.655279 4869 scope.go:117] "RemoveContainer" containerID="b5c7bf1250d0cc530ae366d625afed89128fc7c811ec074c78f9e7e23e8306fd" Feb 18 06:07:02 crc kubenswrapper[4869]: E0218 06:07:02.655801 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5c7bf1250d0cc530ae366d625afed89128fc7c811ec074c78f9e7e23e8306fd\": container with ID starting with b5c7bf1250d0cc530ae366d625afed89128fc7c811ec074c78f9e7e23e8306fd not found: ID does not exist" containerID="b5c7bf1250d0cc530ae366d625afed89128fc7c811ec074c78f9e7e23e8306fd" Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.655843 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5c7bf1250d0cc530ae366d625afed89128fc7c811ec074c78f9e7e23e8306fd"} err="failed to get container status \"b5c7bf1250d0cc530ae366d625afed89128fc7c811ec074c78f9e7e23e8306fd\": rpc error: code = NotFound desc = could not find container \"b5c7bf1250d0cc530ae366d625afed89128fc7c811ec074c78f9e7e23e8306fd\": container with ID starting with b5c7bf1250d0cc530ae366d625afed89128fc7c811ec074c78f9e7e23e8306fd not found: ID does not exist" Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.655863 4869 scope.go:117] "RemoveContainer" containerID="46dcdbc8f8037700dc00414a2271dd253517c4a88d6d81555bd4952023c86483" Feb 18 06:07:02 crc kubenswrapper[4869]: E0218 06:07:02.656279 4869 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46dcdbc8f8037700dc00414a2271dd253517c4a88d6d81555bd4952023c86483\": container with ID starting with 46dcdbc8f8037700dc00414a2271dd253517c4a88d6d81555bd4952023c86483 not found: ID does not exist" containerID="46dcdbc8f8037700dc00414a2271dd253517c4a88d6d81555bd4952023c86483" Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.656326 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46dcdbc8f8037700dc00414a2271dd253517c4a88d6d81555bd4952023c86483"} err="failed to get container status \"46dcdbc8f8037700dc00414a2271dd253517c4a88d6d81555bd4952023c86483\": rpc error: code = NotFound desc = could not find container \"46dcdbc8f8037700dc00414a2271dd253517c4a88d6d81555bd4952023c86483\": container with ID starting with 46dcdbc8f8037700dc00414a2271dd253517c4a88d6d81555bd4952023c86483 not found: ID does not exist" Feb 18 06:07:02 crc kubenswrapper[4869]: W0218 06:07:02.808755 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4c122b7_735c_4d51_96f4_7e7db134dece.slice/crio-16a1fd4fd686b9c14731f2d665dc5f26ddeba956b0af47984149f2ee650a9f6b WatchSource:0}: Error finding container 16a1fd4fd686b9c14731f2d665dc5f26ddeba956b0af47984149f2ee650a9f6b: Status 404 returned error can't find the container with id 16a1fd4fd686b9c14731f2d665dc5f26ddeba956b0af47984149f2ee650a9f6b Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.809340 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-77gnw"] Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.888074 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5b5d66448d-sbn85"] Feb 18 06:07:02 crc kubenswrapper[4869]: I0218 06:07:02.897700 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/neutron-5b5d66448d-sbn85"]
Feb 18 06:07:03 crc kubenswrapper[4869]: I0218 06:07:03.491238 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adfe77ee-719d-4b80-ae06-8a0a370cf7d2" path="/var/lib/kubelet/pods/adfe77ee-719d-4b80-ae06-8a0a370cf7d2/volumes"
Feb 18 06:07:03 crc kubenswrapper[4869]: I0218 06:07:03.492357 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0473766-4d01-4ddf-b722-e8a2a9c6eb4f" path="/var/lib/kubelet/pods/c0473766-4d01-4ddf-b722-e8a2a9c6eb4f/volumes"
Feb 18 06:07:03 crc kubenswrapper[4869]: I0218 06:07:03.543463 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-77gnw" event={"ID":"e4c122b7-735c-4d51-96f4-7e7db134dece","Type":"ContainerStarted","Data":"16a1fd4fd686b9c14731f2d665dc5f26ddeba956b0af47984149f2ee650a9f6b"}
Feb 18 06:07:03 crc kubenswrapper[4869]: I0218 06:07:03.959982 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 18 06:07:03 crc kubenswrapper[4869]: I0218 06:07:03.960026 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 18 06:07:04 crc kubenswrapper[4869]: I0218 06:07:04.007420 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 18 06:07:04 crc kubenswrapper[4869]: I0218 06:07:04.009471 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 18 06:07:04 crc kubenswrapper[4869]: I0218 06:07:04.258727 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:04 crc kubenswrapper[4869]: I0218 06:07:04.259056 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:04 crc kubenswrapper[4869]: I0218 06:07:04.290543 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:04 crc kubenswrapper[4869]: I0218 06:07:04.305698 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:04 crc kubenswrapper[4869]: I0218 06:07:04.558974 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:04 crc kubenswrapper[4869]: I0218 06:07:04.559031 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 18 06:07:04 crc kubenswrapper[4869]: I0218 06:07:04.559144 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:04 crc kubenswrapper[4869]: I0218 06:07:04.559228 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 18 06:07:06 crc kubenswrapper[4869]: I0218 06:07:06.571493 4869 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 18 06:07:06 crc kubenswrapper[4869]: I0218 06:07:06.571774 4869 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 18 06:07:06 crc kubenswrapper[4869]: I0218 06:07:06.571505 4869 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 18 06:07:06 crc kubenswrapper[4869]: I0218 06:07:06.571798 4869 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 18 06:07:06 crc kubenswrapper[4869]: I0218 06:07:06.624950 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 18 06:07:06 crc kubenswrapper[4869]: I0218 06:07:06.628756 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 18 06:07:06 crc kubenswrapper[4869]: I0218 06:07:06.852424 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:07 crc kubenswrapper[4869]: I0218 06:07:07.128459 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:10 crc kubenswrapper[4869]: I0218 06:07:10.132516 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 06:07:10 crc kubenswrapper[4869]: I0218 06:07:10.133155 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 06:07:12 crc kubenswrapper[4869]: I0218 06:07:12.642613 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-77gnw" event={"ID":"e4c122b7-735c-4d51-96f4-7e7db134dece","Type":"ContainerStarted","Data":"12ee1f62901035ef5681355d4239845d7a5bf88512761454c9b3a4b1304e0614"}
Feb 18 06:07:12 crc kubenswrapper[4869]: I0218 06:07:12.669103 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-77gnw" podStartSLOduration=3.019658171 podStartE2EDuration="11.66907674s" podCreationTimestamp="2026-02-18 06:07:01 +0000 UTC" firstStartedPulling="2026-02-18 06:07:02.811196299 +0000 UTC m=+1119.980284531" lastFinishedPulling="2026-02-18 06:07:11.460614848 +0000 UTC m=+1128.629703100" observedRunningTime="2026-02-18 06:07:12.661160708 +0000 UTC m=+1129.830248940"
watchObservedRunningTime="2026-02-18 06:07:12.66907674 +0000 UTC m=+1129.838164972"
Feb 18 06:07:14 crc kubenswrapper[4869]: I0218 06:07:14.478200 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="f2d4f178-d98c-45b8-9c71-95d42a42093b" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Feb 18 06:07:22 crc kubenswrapper[4869]: I0218 06:07:22.754869 4869 generic.go:334] "Generic (PLEG): container finished" podID="e4c122b7-735c-4d51-96f4-7e7db134dece" containerID="12ee1f62901035ef5681355d4239845d7a5bf88512761454c9b3a4b1304e0614" exitCode=0
Feb 18 06:07:22 crc kubenswrapper[4869]: I0218 06:07:22.754934 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-77gnw" event={"ID":"e4c122b7-735c-4d51-96f4-7e7db134dece","Type":"ContainerDied","Data":"12ee1f62901035ef5681355d4239845d7a5bf88512761454c9b3a4b1304e0614"}
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.124773 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-77gnw"
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.179483 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4c122b7-735c-4d51-96f4-7e7db134dece-scripts\") pod \"e4c122b7-735c-4d51-96f4-7e7db134dece\" (UID: \"e4c122b7-735c-4d51-96f4-7e7db134dece\") "
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.179526 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c122b7-735c-4d51-96f4-7e7db134dece-combined-ca-bundle\") pod \"e4c122b7-735c-4d51-96f4-7e7db134dece\" (UID: \"e4c122b7-735c-4d51-96f4-7e7db134dece\") "
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.179650 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4c122b7-735c-4d51-96f4-7e7db134dece-config-data\") pod \"e4c122b7-735c-4d51-96f4-7e7db134dece\" (UID: \"e4c122b7-735c-4d51-96f4-7e7db134dece\") "
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.179698 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvf56\" (UniqueName: \"kubernetes.io/projected/e4c122b7-735c-4d51-96f4-7e7db134dece-kube-api-access-bvf56\") pod \"e4c122b7-735c-4d51-96f4-7e7db134dece\" (UID: \"e4c122b7-735c-4d51-96f4-7e7db134dece\") "
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.185807 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c122b7-735c-4d51-96f4-7e7db134dece-scripts" (OuterVolumeSpecName: "scripts") pod "e4c122b7-735c-4d51-96f4-7e7db134dece" (UID: "e4c122b7-735c-4d51-96f4-7e7db134dece"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.186036 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4c122b7-735c-4d51-96f4-7e7db134dece-kube-api-access-bvf56" (OuterVolumeSpecName: "kube-api-access-bvf56") pod "e4c122b7-735c-4d51-96f4-7e7db134dece" (UID: "e4c122b7-735c-4d51-96f4-7e7db134dece"). InnerVolumeSpecName "kube-api-access-bvf56". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.204884 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c122b7-735c-4d51-96f4-7e7db134dece-config-data" (OuterVolumeSpecName: "config-data") pod "e4c122b7-735c-4d51-96f4-7e7db134dece" (UID: "e4c122b7-735c-4d51-96f4-7e7db134dece"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.224592 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c122b7-735c-4d51-96f4-7e7db134dece-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4c122b7-735c-4d51-96f4-7e7db134dece" (UID: "e4c122b7-735c-4d51-96f4-7e7db134dece"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.281212 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4c122b7-735c-4d51-96f4-7e7db134dece-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.281276 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c122b7-735c-4d51-96f4-7e7db134dece-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.281294 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4c122b7-735c-4d51-96f4-7e7db134dece-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.281308 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvf56\" (UniqueName: \"kubernetes.io/projected/e4c122b7-735c-4d51-96f4-7e7db134dece-kube-api-access-bvf56\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.772107 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-77gnw" event={"ID":"e4c122b7-735c-4d51-96f4-7e7db134dece","Type":"ContainerDied","Data":"16a1fd4fd686b9c14731f2d665dc5f26ddeba956b0af47984149f2ee650a9f6b"}
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.772489 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16a1fd4fd686b9c14731f2d665dc5f26ddeba956b0af47984149f2ee650a9f6b"
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.772233 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-77gnw"
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.861205 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 18 06:07:24 crc kubenswrapper[4869]: E0218 06:07:24.861601 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4c122b7-735c-4d51-96f4-7e7db134dece" containerName="nova-cell0-conductor-db-sync"
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.861620 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4c122b7-735c-4d51-96f4-7e7db134dece" containerName="nova-cell0-conductor-db-sync"
Feb 18 06:07:24 crc kubenswrapper[4869]: E0218 06:07:24.861638 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0473766-4d01-4ddf-b722-e8a2a9c6eb4f" containerName="neutron-api"
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.861644 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0473766-4d01-4ddf-b722-e8a2a9c6eb4f" containerName="neutron-api"
Feb 18 06:07:24 crc kubenswrapper[4869]: E0218 06:07:24.861656 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0473766-4d01-4ddf-b722-e8a2a9c6eb4f" containerName="neutron-httpd"
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.861662 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0473766-4d01-4ddf-b722-e8a2a9c6eb4f" containerName="neutron-httpd"
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.861840 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4c122b7-735c-4d51-96f4-7e7db134dece" containerName="nova-cell0-conductor-db-sync"
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.861855 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0473766-4d01-4ddf-b722-e8a2a9c6eb4f" containerName="neutron-httpd"
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.861874 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0473766-4d01-4ddf-b722-e8a2a9c6eb4f" containerName="neutron-api"
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.862428 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.864191 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.865936 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-mjh4x"
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.877888 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.895777 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t688j\" (UniqueName: \"kubernetes.io/projected/eabb4471-00e8-4edb-9128-249b4057d5d7-kube-api-access-t688j\") pod \"nova-cell0-conductor-0\" (UID: \"eabb4471-00e8-4edb-9128-249b4057d5d7\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.895982 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eabb4471-00e8-4edb-9128-249b4057d5d7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"eabb4471-00e8-4edb-9128-249b4057d5d7\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.896018 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eabb4471-00e8-4edb-9128-249b4057d5d7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"eabb4471-00e8-4edb-9128-249b4057d5d7\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 06:07:24 crc kubenswrapper[4869]:
I0218 06:07:24.997976 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eabb4471-00e8-4edb-9128-249b4057d5d7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"eabb4471-00e8-4edb-9128-249b4057d5d7\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.998242 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eabb4471-00e8-4edb-9128-249b4057d5d7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"eabb4471-00e8-4edb-9128-249b4057d5d7\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 06:07:24 crc kubenswrapper[4869]: I0218 06:07:24.998453 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t688j\" (UniqueName: \"kubernetes.io/projected/eabb4471-00e8-4edb-9128-249b4057d5d7-kube-api-access-t688j\") pod \"nova-cell0-conductor-0\" (UID: \"eabb4471-00e8-4edb-9128-249b4057d5d7\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.002430 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eabb4471-00e8-4edb-9128-249b4057d5d7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"eabb4471-00e8-4edb-9128-249b4057d5d7\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.004579 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eabb4471-00e8-4edb-9128-249b4057d5d7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"eabb4471-00e8-4edb-9128-249b4057d5d7\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.018336 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t688j\" (UniqueName: \"kubernetes.io/projected/eabb4471-00e8-4edb-9128-249b4057d5d7-kube-api-access-t688j\") pod \"nova-cell0-conductor-0\" (UID: \"eabb4471-00e8-4edb-9128-249b4057d5d7\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.197871 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.632827 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.679571 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.713359 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2d4f178-d98c-45b8-9c71-95d42a42093b-config-data\") pod \"f2d4f178-d98c-45b8-9c71-95d42a42093b\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") "
Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.713400 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2d4f178-d98c-45b8-9c71-95d42a42093b-combined-ca-bundle\") pod \"f2d4f178-d98c-45b8-9c71-95d42a42093b\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") "
Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.713457 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2d4f178-d98c-45b8-9c71-95d42a42093b-sg-core-conf-yaml\") pod \"f2d4f178-d98c-45b8-9c71-95d42a42093b\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") "
Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.713546 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkl8t\" (UniqueName: \"kubernetes.io/projected/f2d4f178-d98c-45b8-9c71-95d42a42093b-kube-api-access-jkl8t\") pod \"f2d4f178-d98c-45b8-9c71-95d42a42093b\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") "
Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.713587 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2d4f178-d98c-45b8-9c71-95d42a42093b-log-httpd\") pod \"f2d4f178-d98c-45b8-9c71-95d42a42093b\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") "
Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.713705 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2d4f178-d98c-45b8-9c71-95d42a42093b-run-httpd\") pod \"f2d4f178-d98c-45b8-9c71-95d42a42093b\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") "
Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.713777 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2d4f178-d98c-45b8-9c71-95d42a42093b-scripts\") pod \"f2d4f178-d98c-45b8-9c71-95d42a42093b\" (UID: \"f2d4f178-d98c-45b8-9c71-95d42a42093b\") "
Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.717694 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2d4f178-d98c-45b8-9c71-95d42a42093b-scripts" (OuterVolumeSpecName: "scripts") pod "f2d4f178-d98c-45b8-9c71-95d42a42093b" (UID: "f2d4f178-d98c-45b8-9c71-95d42a42093b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.719875 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2d4f178-d98c-45b8-9c71-95d42a42093b-kube-api-access-jkl8t" (OuterVolumeSpecName: "kube-api-access-jkl8t") pod "f2d4f178-d98c-45b8-9c71-95d42a42093b" (UID: "f2d4f178-d98c-45b8-9c71-95d42a42093b"). InnerVolumeSpecName "kube-api-access-jkl8t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.720275 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2d4f178-d98c-45b8-9c71-95d42a42093b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f2d4f178-d98c-45b8-9c71-95d42a42093b" (UID: "f2d4f178-d98c-45b8-9c71-95d42a42093b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.720886 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2d4f178-d98c-45b8-9c71-95d42a42093b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f2d4f178-d98c-45b8-9c71-95d42a42093b" (UID: "f2d4f178-d98c-45b8-9c71-95d42a42093b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.747238 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2d4f178-d98c-45b8-9c71-95d42a42093b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f2d4f178-d98c-45b8-9c71-95d42a42093b" (UID: "f2d4f178-d98c-45b8-9c71-95d42a42093b"). InnerVolumeSpecName "sg-core-conf-yaml".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.784284 4869 generic.go:334] "Generic (PLEG): container finished" podID="f2d4f178-d98c-45b8-9c71-95d42a42093b" containerID="8d94ae41edc613878eb9622a479dd61f5f477bbd27c931482752655bbb90dbc2" exitCode=137 Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.784387 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2d4f178-d98c-45b8-9c71-95d42a42093b","Type":"ContainerDied","Data":"8d94ae41edc613878eb9622a479dd61f5f477bbd27c931482752655bbb90dbc2"} Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.784419 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2d4f178-d98c-45b8-9c71-95d42a42093b","Type":"ContainerDied","Data":"b125c2d9056277fa0a1b59122ccc7a3f4b74146f507ed3d2d164bfb278a48d3a"} Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.784465 4869 scope.go:117] "RemoveContainer" containerID="8d94ae41edc613878eb9622a479dd61f5f477bbd27c931482752655bbb90dbc2" Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.784818 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.796737 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"eabb4471-00e8-4edb-9128-249b4057d5d7","Type":"ContainerStarted","Data":"c01364f5f13c3e8b4706087afc2a43f9e06d35bc95ba28052abf0a2f3970304b"} Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.804441 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2d4f178-d98c-45b8-9c71-95d42a42093b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2d4f178-d98c-45b8-9c71-95d42a42093b" (UID: "f2d4f178-d98c-45b8-9c71-95d42a42093b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.814857 4869 scope.go:117] "RemoveContainer" containerID="d9310aa229951f1931fd4ba1981a8f8045bddbfc715bc036dbd53df6c764ad60" Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.816607 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2d4f178-d98c-45b8-9c71-95d42a42093b-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.816634 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2d4f178-d98c-45b8-9c71-95d42a42093b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.816644 4869 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2d4f178-d98c-45b8-9c71-95d42a42093b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.816654 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkl8t\" (UniqueName: \"kubernetes.io/projected/f2d4f178-d98c-45b8-9c71-95d42a42093b-kube-api-access-jkl8t\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.816663 4869 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2d4f178-d98c-45b8-9c71-95d42a42093b-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.816672 4869 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2d4f178-d98c-45b8-9c71-95d42a42093b-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.827690 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/f2d4f178-d98c-45b8-9c71-95d42a42093b-config-data" (OuterVolumeSpecName: "config-data") pod "f2d4f178-d98c-45b8-9c71-95d42a42093b" (UID: "f2d4f178-d98c-45b8-9c71-95d42a42093b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.857374 4869 scope.go:117] "RemoveContainer" containerID="9224fb62c87f460b5072f11eb2a62d4746660cc5f73f36387e91c9d5b6ff323a" Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.879606 4869 scope.go:117] "RemoveContainer" containerID="ce945d401ed9c1523a72d9121ea71376ab58432cb527d1d112ad2b5e527b051e" Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.908109 4869 scope.go:117] "RemoveContainer" containerID="8d94ae41edc613878eb9622a479dd61f5f477bbd27c931482752655bbb90dbc2" Feb 18 06:07:25 crc kubenswrapper[4869]: E0218 06:07:25.908671 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d94ae41edc613878eb9622a479dd61f5f477bbd27c931482752655bbb90dbc2\": container with ID starting with 8d94ae41edc613878eb9622a479dd61f5f477bbd27c931482752655bbb90dbc2 not found: ID does not exist" containerID="8d94ae41edc613878eb9622a479dd61f5f477bbd27c931482752655bbb90dbc2" Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.908764 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d94ae41edc613878eb9622a479dd61f5f477bbd27c931482752655bbb90dbc2"} err="failed to get container status \"8d94ae41edc613878eb9622a479dd61f5f477bbd27c931482752655bbb90dbc2\": rpc error: code = NotFound desc = could not find container \"8d94ae41edc613878eb9622a479dd61f5f477bbd27c931482752655bbb90dbc2\": container with ID starting with 8d94ae41edc613878eb9622a479dd61f5f477bbd27c931482752655bbb90dbc2 not found: ID does not exist" Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.908795 4869 scope.go:117] "RemoveContainer" 
containerID="d9310aa229951f1931fd4ba1981a8f8045bddbfc715bc036dbd53df6c764ad60" Feb 18 06:07:25 crc kubenswrapper[4869]: E0218 06:07:25.909481 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9310aa229951f1931fd4ba1981a8f8045bddbfc715bc036dbd53df6c764ad60\": container with ID starting with d9310aa229951f1931fd4ba1981a8f8045bddbfc715bc036dbd53df6c764ad60 not found: ID does not exist" containerID="d9310aa229951f1931fd4ba1981a8f8045bddbfc715bc036dbd53df6c764ad60" Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.909512 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9310aa229951f1931fd4ba1981a8f8045bddbfc715bc036dbd53df6c764ad60"} err="failed to get container status \"d9310aa229951f1931fd4ba1981a8f8045bddbfc715bc036dbd53df6c764ad60\": rpc error: code = NotFound desc = could not find container \"d9310aa229951f1931fd4ba1981a8f8045bddbfc715bc036dbd53df6c764ad60\": container with ID starting with d9310aa229951f1931fd4ba1981a8f8045bddbfc715bc036dbd53df6c764ad60 not found: ID does not exist" Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.909543 4869 scope.go:117] "RemoveContainer" containerID="9224fb62c87f460b5072f11eb2a62d4746660cc5f73f36387e91c9d5b6ff323a" Feb 18 06:07:25 crc kubenswrapper[4869]: E0218 06:07:25.909865 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9224fb62c87f460b5072f11eb2a62d4746660cc5f73f36387e91c9d5b6ff323a\": container with ID starting with 9224fb62c87f460b5072f11eb2a62d4746660cc5f73f36387e91c9d5b6ff323a not found: ID does not exist" containerID="9224fb62c87f460b5072f11eb2a62d4746660cc5f73f36387e91c9d5b6ff323a" Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.909900 4869 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9224fb62c87f460b5072f11eb2a62d4746660cc5f73f36387e91c9d5b6ff323a"} err="failed to get container status \"9224fb62c87f460b5072f11eb2a62d4746660cc5f73f36387e91c9d5b6ff323a\": rpc error: code = NotFound desc = could not find container \"9224fb62c87f460b5072f11eb2a62d4746660cc5f73f36387e91c9d5b6ff323a\": container with ID starting with 9224fb62c87f460b5072f11eb2a62d4746660cc5f73f36387e91c9d5b6ff323a not found: ID does not exist" Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.909920 4869 scope.go:117] "RemoveContainer" containerID="ce945d401ed9c1523a72d9121ea71376ab58432cb527d1d112ad2b5e527b051e" Feb 18 06:07:25 crc kubenswrapper[4869]: E0218 06:07:25.910352 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce945d401ed9c1523a72d9121ea71376ab58432cb527d1d112ad2b5e527b051e\": container with ID starting with ce945d401ed9c1523a72d9121ea71376ab58432cb527d1d112ad2b5e527b051e not found: ID does not exist" containerID="ce945d401ed9c1523a72d9121ea71376ab58432cb527d1d112ad2b5e527b051e" Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.910378 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce945d401ed9c1523a72d9121ea71376ab58432cb527d1d112ad2b5e527b051e"} err="failed to get container status \"ce945d401ed9c1523a72d9121ea71376ab58432cb527d1d112ad2b5e527b051e\": rpc error: code = NotFound desc = could not find container \"ce945d401ed9c1523a72d9121ea71376ab58432cb527d1d112ad2b5e527b051e\": container with ID starting with ce945d401ed9c1523a72d9121ea71376ab58432cb527d1d112ad2b5e527b051e not found: ID does not exist" Feb 18 06:07:25 crc kubenswrapper[4869]: I0218 06:07:25.917866 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2d4f178-d98c-45b8-9c71-95d42a42093b-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:26 crc kubenswrapper[4869]: 
I0218 06:07:26.132059 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.142415 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.154312 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:07:26 crc kubenswrapper[4869]: E0218 06:07:26.154773 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2d4f178-d98c-45b8-9c71-95d42a42093b" containerName="ceilometer-notification-agent" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.154788 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2d4f178-d98c-45b8-9c71-95d42a42093b" containerName="ceilometer-notification-agent" Feb 18 06:07:26 crc kubenswrapper[4869]: E0218 06:07:26.154804 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2d4f178-d98c-45b8-9c71-95d42a42093b" containerName="proxy-httpd" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.154812 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2d4f178-d98c-45b8-9c71-95d42a42093b" containerName="proxy-httpd" Feb 18 06:07:26 crc kubenswrapper[4869]: E0218 06:07:26.154823 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2d4f178-d98c-45b8-9c71-95d42a42093b" containerName="sg-core" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.154831 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2d4f178-d98c-45b8-9c71-95d42a42093b" containerName="sg-core" Feb 18 06:07:26 crc kubenswrapper[4869]: E0218 06:07:26.154860 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2d4f178-d98c-45b8-9c71-95d42a42093b" containerName="ceilometer-central-agent" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.154869 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2d4f178-d98c-45b8-9c71-95d42a42093b" 
containerName="ceilometer-central-agent" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.155096 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2d4f178-d98c-45b8-9c71-95d42a42093b" containerName="ceilometer-central-agent" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.155117 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2d4f178-d98c-45b8-9c71-95d42a42093b" containerName="sg-core" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.155133 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2d4f178-d98c-45b8-9c71-95d42a42093b" containerName="proxy-httpd" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.155145 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2d4f178-d98c-45b8-9c71-95d42a42093b" containerName="ceilometer-notification-agent" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.156932 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.159325 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.160299 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.178226 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.227378 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2da880ed-7e25-4267-87cc-d093f502b847-config-data\") pod \"ceilometer-0\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") " pod="openstack/ceilometer-0" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.227426 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2da880ed-7e25-4267-87cc-d093f502b847-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") " pod="openstack/ceilometer-0" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.227455 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2da880ed-7e25-4267-87cc-d093f502b847-run-httpd\") pod \"ceilometer-0\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") " pod="openstack/ceilometer-0" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.227479 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndxn8\" (UniqueName: \"kubernetes.io/projected/2da880ed-7e25-4267-87cc-d093f502b847-kube-api-access-ndxn8\") pod \"ceilometer-0\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") " pod="openstack/ceilometer-0" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.227578 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2da880ed-7e25-4267-87cc-d093f502b847-scripts\") pod \"ceilometer-0\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") " pod="openstack/ceilometer-0" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.227653 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2da880ed-7e25-4267-87cc-d093f502b847-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") " pod="openstack/ceilometer-0" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.227682 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/2da880ed-7e25-4267-87cc-d093f502b847-log-httpd\") pod \"ceilometer-0\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") " pod="openstack/ceilometer-0" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.329692 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2da880ed-7e25-4267-87cc-d093f502b847-config-data\") pod \"ceilometer-0\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") " pod="openstack/ceilometer-0" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.329752 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2da880ed-7e25-4267-87cc-d093f502b847-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") " pod="openstack/ceilometer-0" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.329770 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2da880ed-7e25-4267-87cc-d093f502b847-run-httpd\") pod \"ceilometer-0\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") " pod="openstack/ceilometer-0" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.329792 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndxn8\" (UniqueName: \"kubernetes.io/projected/2da880ed-7e25-4267-87cc-d093f502b847-kube-api-access-ndxn8\") pod \"ceilometer-0\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") " pod="openstack/ceilometer-0" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.329855 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2da880ed-7e25-4267-87cc-d093f502b847-scripts\") pod \"ceilometer-0\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") " pod="openstack/ceilometer-0" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 
06:07:26.329892 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2da880ed-7e25-4267-87cc-d093f502b847-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") " pod="openstack/ceilometer-0" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.329908 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2da880ed-7e25-4267-87cc-d093f502b847-log-httpd\") pod \"ceilometer-0\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") " pod="openstack/ceilometer-0" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.330389 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2da880ed-7e25-4267-87cc-d093f502b847-log-httpd\") pod \"ceilometer-0\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") " pod="openstack/ceilometer-0" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.330476 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2da880ed-7e25-4267-87cc-d093f502b847-run-httpd\") pod \"ceilometer-0\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") " pod="openstack/ceilometer-0" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.334914 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2da880ed-7e25-4267-87cc-d093f502b847-scripts\") pod \"ceilometer-0\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") " pod="openstack/ceilometer-0" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.335968 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2da880ed-7e25-4267-87cc-d093f502b847-config-data\") pod \"ceilometer-0\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") " 
pod="openstack/ceilometer-0" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.337012 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2da880ed-7e25-4267-87cc-d093f502b847-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") " pod="openstack/ceilometer-0" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.349661 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2da880ed-7e25-4267-87cc-d093f502b847-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") " pod="openstack/ceilometer-0" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.350298 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndxn8\" (UniqueName: \"kubernetes.io/projected/2da880ed-7e25-4267-87cc-d093f502b847-kube-api-access-ndxn8\") pod \"ceilometer-0\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") " pod="openstack/ceilometer-0" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.531282 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.806603 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"eabb4471-00e8-4edb-9128-249b4057d5d7","Type":"ContainerStarted","Data":"84661fb2164a76f6f02ecf4fe11278d9e3b809d4802713f4325b5ae297ef1013"} Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.806711 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.834212 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.834194437 podStartE2EDuration="2.834194437s" podCreationTimestamp="2026-02-18 06:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:07:26.820243428 +0000 UTC m=+1143.989331680" watchObservedRunningTime="2026-02-18 06:07:26.834194437 +0000 UTC m=+1144.003282669" Feb 18 06:07:26 crc kubenswrapper[4869]: I0218 06:07:26.990107 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:07:27 crc kubenswrapper[4869]: I0218 06:07:27.480373 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2d4f178-d98c-45b8-9c71-95d42a42093b" path="/var/lib/kubelet/pods/f2d4f178-d98c-45b8-9c71-95d42a42093b/volumes" Feb 18 06:07:27 crc kubenswrapper[4869]: I0218 06:07:27.816212 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2da880ed-7e25-4267-87cc-d093f502b847","Type":"ContainerStarted","Data":"773c3a27fc14f46353ac8c7d99f55590e806c491b0cd6ebba2495b49b7cb9ecf"} Feb 18 06:07:28 crc kubenswrapper[4869]: I0218 06:07:28.824525 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2da880ed-7e25-4267-87cc-d093f502b847","Type":"ContainerStarted","Data":"2775c65506d434021fa41ea918c1a72c653426c64e1c829a543a0b4e13adf337"} Feb 18 06:07:29 crc kubenswrapper[4869]: I0218 06:07:29.832554 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2da880ed-7e25-4267-87cc-d093f502b847","Type":"ContainerStarted","Data":"7b94cd7c60ba828a168655f13c766ff8f47e1e654737d9b94a8f1786bf69e465"} Feb 18 06:07:29 crc kubenswrapper[4869]: I0218 06:07:29.834158 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2da880ed-7e25-4267-87cc-d093f502b847","Type":"ContainerStarted","Data":"d2e56b71a49f97756fad8968f47081cce78697d80b8fa86c03b2b129104577a3"} Feb 18 06:07:30 crc kubenswrapper[4869]: I0218 06:07:30.226459 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 18 06:07:30 crc kubenswrapper[4869]: I0218 06:07:30.709729 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-74k4h"] Feb 18 06:07:30 crc kubenswrapper[4869]: I0218 06:07:30.711313 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-74k4h" Feb 18 06:07:30 crc kubenswrapper[4869]: I0218 06:07:30.726586 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-74k4h"] Feb 18 06:07:30 crc kubenswrapper[4869]: I0218 06:07:30.726843 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 18 06:07:30 crc kubenswrapper[4869]: I0218 06:07:30.735721 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 18 06:07:30 crc kubenswrapper[4869]: I0218 06:07:30.805226 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dc931b7-a618-4dc7-b89e-7c516d699154-config-data\") pod \"nova-cell0-cell-mapping-74k4h\" (UID: \"8dc931b7-a618-4dc7-b89e-7c516d699154\") " pod="openstack/nova-cell0-cell-mapping-74k4h" Feb 18 06:07:30 crc kubenswrapper[4869]: I0218 06:07:30.805281 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kdnl\" (UniqueName: \"kubernetes.io/projected/8dc931b7-a618-4dc7-b89e-7c516d699154-kube-api-access-5kdnl\") pod \"nova-cell0-cell-mapping-74k4h\" (UID: \"8dc931b7-a618-4dc7-b89e-7c516d699154\") " pod="openstack/nova-cell0-cell-mapping-74k4h" Feb 18 06:07:30 crc kubenswrapper[4869]: I0218 06:07:30.805490 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc931b7-a618-4dc7-b89e-7c516d699154-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-74k4h\" (UID: \"8dc931b7-a618-4dc7-b89e-7c516d699154\") " pod="openstack/nova-cell0-cell-mapping-74k4h" Feb 18 06:07:30 crc kubenswrapper[4869]: I0218 06:07:30.805662 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/8dc931b7-a618-4dc7-b89e-7c516d699154-scripts\") pod \"nova-cell0-cell-mapping-74k4h\" (UID: \"8dc931b7-a618-4dc7-b89e-7c516d699154\") " pod="openstack/nova-cell0-cell-mapping-74k4h" Feb 18 06:07:30 crc kubenswrapper[4869]: I0218 06:07:30.883393 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 18 06:07:30 crc kubenswrapper[4869]: I0218 06:07:30.885613 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:07:30 crc kubenswrapper[4869]: I0218 06:07:30.898595 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 18 06:07:30 crc kubenswrapper[4869]: I0218 06:07:30.906617 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 18 06:07:30 crc kubenswrapper[4869]: I0218 06:07:30.908070 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/040199a9-8498-4011-b50c-051616a21fff-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"040199a9-8498-4011-b50c-051616a21fff\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:07:30 crc kubenswrapper[4869]: I0218 06:07:30.908172 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dc931b7-a618-4dc7-b89e-7c516d699154-scripts\") pod \"nova-cell0-cell-mapping-74k4h\" (UID: \"8dc931b7-a618-4dc7-b89e-7c516d699154\") " pod="openstack/nova-cell0-cell-mapping-74k4h" Feb 18 06:07:30 crc kubenswrapper[4869]: I0218 06:07:30.908213 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/040199a9-8498-4011-b50c-051616a21fff-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"040199a9-8498-4011-b50c-051616a21fff\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:07:30 crc kubenswrapper[4869]: I0218 06:07:30.908254 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc8hk\" (UniqueName: \"kubernetes.io/projected/040199a9-8498-4011-b50c-051616a21fff-kube-api-access-fc8hk\") pod \"nova-cell1-novncproxy-0\" (UID: \"040199a9-8498-4011-b50c-051616a21fff\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:07:30 crc kubenswrapper[4869]: I0218 06:07:30.908290 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dc931b7-a618-4dc7-b89e-7c516d699154-config-data\") pod \"nova-cell0-cell-mapping-74k4h\" (UID: \"8dc931b7-a618-4dc7-b89e-7c516d699154\") " pod="openstack/nova-cell0-cell-mapping-74k4h"
Feb 18 06:07:30 crc kubenswrapper[4869]: I0218 06:07:30.908319 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kdnl\" (UniqueName: \"kubernetes.io/projected/8dc931b7-a618-4dc7-b89e-7c516d699154-kube-api-access-5kdnl\") pod \"nova-cell0-cell-mapping-74k4h\" (UID: \"8dc931b7-a618-4dc7-b89e-7c516d699154\") " pod="openstack/nova-cell0-cell-mapping-74k4h"
Feb 18 06:07:30 crc kubenswrapper[4869]: I0218 06:07:30.908360 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc931b7-a618-4dc7-b89e-7c516d699154-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-74k4h\" (UID: \"8dc931b7-a618-4dc7-b89e-7c516d699154\") " pod="openstack/nova-cell0-cell-mapping-74k4h"
Feb 18 06:07:30 crc kubenswrapper[4869]: I0218 06:07:30.938694 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dc931b7-a618-4dc7-b89e-7c516d699154-scripts\") pod \"nova-cell0-cell-mapping-74k4h\" (UID: \"8dc931b7-a618-4dc7-b89e-7c516d699154\") " pod="openstack/nova-cell0-cell-mapping-74k4h"
Feb 18 06:07:30 crc kubenswrapper[4869]: I0218 06:07:30.956284 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc931b7-a618-4dc7-b89e-7c516d699154-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-74k4h\" (UID: \"8dc931b7-a618-4dc7-b89e-7c516d699154\") " pod="openstack/nova-cell0-cell-mapping-74k4h"
Feb 18 06:07:30 crc kubenswrapper[4869]: I0218 06:07:30.962281 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kdnl\" (UniqueName: \"kubernetes.io/projected/8dc931b7-a618-4dc7-b89e-7c516d699154-kube-api-access-5kdnl\") pod \"nova-cell0-cell-mapping-74k4h\" (UID: \"8dc931b7-a618-4dc7-b89e-7c516d699154\") " pod="openstack/nova-cell0-cell-mapping-74k4h"
Feb 18 06:07:30 crc kubenswrapper[4869]: I0218 06:07:30.989352 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dc931b7-a618-4dc7-b89e-7c516d699154-config-data\") pod \"nova-cell0-cell-mapping-74k4h\" (UID: \"8dc931b7-a618-4dc7-b89e-7c516d699154\") " pod="openstack/nova-cell0-cell-mapping-74k4h"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.012115 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/040199a9-8498-4011-b50c-051616a21fff-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"040199a9-8498-4011-b50c-051616a21fff\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.012175 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc8hk\" (UniqueName: \"kubernetes.io/projected/040199a9-8498-4011-b50c-051616a21fff-kube-api-access-fc8hk\") pod \"nova-cell1-novncproxy-0\" (UID: \"040199a9-8498-4011-b50c-051616a21fff\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.012251 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/040199a9-8498-4011-b50c-051616a21fff-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"040199a9-8498-4011-b50c-051616a21fff\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.020795 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/040199a9-8498-4011-b50c-051616a21fff-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"040199a9-8498-4011-b50c-051616a21fff\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.032465 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/040199a9-8498-4011-b50c-051616a21fff-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"040199a9-8498-4011-b50c-051616a21fff\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.033842 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.035375 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.037341 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-74k4h"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.043813 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.071402 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc8hk\" (UniqueName: \"kubernetes.io/projected/040199a9-8498-4011-b50c-051616a21fff-kube-api-access-fc8hk\") pod \"nova-cell1-novncproxy-0\" (UID: \"040199a9-8498-4011-b50c-051616a21fff\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.073255 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.122695 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1af28061-c2b3-4800-8435-a6c2ca8c7cf0-config-data\") pod \"nova-api-0\" (UID: \"1af28061-c2b3-4800-8435-a6c2ca8c7cf0\") " pod="openstack/nova-api-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.122759 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1af28061-c2b3-4800-8435-a6c2ca8c7cf0-logs\") pod \"nova-api-0\" (UID: \"1af28061-c2b3-4800-8435-a6c2ca8c7cf0\") " pod="openstack/nova-api-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.122927 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtzxz\" (UniqueName: \"kubernetes.io/projected/1af28061-c2b3-4800-8435-a6c2ca8c7cf0-kube-api-access-rtzxz\") pod \"nova-api-0\" (UID: \"1af28061-c2b3-4800-8435-a6c2ca8c7cf0\") " pod="openstack/nova-api-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.123003 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1af28061-c2b3-4800-8435-a6c2ca8c7cf0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1af28061-c2b3-4800-8435-a6c2ca8c7cf0\") " pod="openstack/nova-api-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.123445 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.125222 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.127340 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.158583 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.179465 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.224548 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1af28061-c2b3-4800-8435-a6c2ca8c7cf0-config-data\") pod \"nova-api-0\" (UID: \"1af28061-c2b3-4800-8435-a6c2ca8c7cf0\") " pod="openstack/nova-api-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.224589 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1af28061-c2b3-4800-8435-a6c2ca8c7cf0-logs\") pod \"nova-api-0\" (UID: \"1af28061-c2b3-4800-8435-a6c2ca8c7cf0\") " pod="openstack/nova-api-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.224660 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtzxz\" (UniqueName: \"kubernetes.io/projected/1af28061-c2b3-4800-8435-a6c2ca8c7cf0-kube-api-access-rtzxz\") pod \"nova-api-0\" (UID: \"1af28061-c2b3-4800-8435-a6c2ca8c7cf0\") " pod="openstack/nova-api-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.224694 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1af28061-c2b3-4800-8435-a6c2ca8c7cf0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1af28061-c2b3-4800-8435-a6c2ca8c7cf0\") " pod="openstack/nova-api-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.235766 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1af28061-c2b3-4800-8435-a6c2ca8c7cf0-config-data\") pod \"nova-api-0\" (UID: \"1af28061-c2b3-4800-8435-a6c2ca8c7cf0\") " pod="openstack/nova-api-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.237473 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1af28061-c2b3-4800-8435-a6c2ca8c7cf0-logs\") pod \"nova-api-0\" (UID: \"1af28061-c2b3-4800-8435-a6c2ca8c7cf0\") " pod="openstack/nova-api-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.249218 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1af28061-c2b3-4800-8435-a6c2ca8c7cf0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1af28061-c2b3-4800-8435-a6c2ca8c7cf0\") " pod="openstack/nova-api-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.249282 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.255379 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.258829 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.265483 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtzxz\" (UniqueName: \"kubernetes.io/projected/1af28061-c2b3-4800-8435-a6c2ca8c7cf0-kube-api-access-rtzxz\") pod \"nova-api-0\" (UID: \"1af28061-c2b3-4800-8435-a6c2ca8c7cf0\") " pod="openstack/nova-api-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.286294 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.328065 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-wrp7j"]
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.328708 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg6vx\" (UniqueName: \"kubernetes.io/projected/35c3bbd2-947c-45ba-a855-7bcd2649d34f-kube-api-access-kg6vx\") pod \"nova-metadata-0\" (UID: \"35c3bbd2-947c-45ba-a855-7bcd2649d34f\") " pod="openstack/nova-metadata-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.328792 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35c3bbd2-947c-45ba-a855-7bcd2649d34f-config-data\") pod \"nova-metadata-0\" (UID: \"35c3bbd2-947c-45ba-a855-7bcd2649d34f\") " pod="openstack/nova-metadata-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.328823 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/338060e9-0b03-460e-a0d8-28f78a5de526-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"338060e9-0b03-460e-a0d8-28f78a5de526\") " pod="openstack/nova-scheduler-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.328938 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/338060e9-0b03-460e-a0d8-28f78a5de526-config-data\") pod \"nova-scheduler-0\" (UID: \"338060e9-0b03-460e-a0d8-28f78a5de526\") " pod="openstack/nova-scheduler-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.328969 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35c3bbd2-947c-45ba-a855-7bcd2649d34f-logs\") pod \"nova-metadata-0\" (UID: \"35c3bbd2-947c-45ba-a855-7bcd2649d34f\") " pod="openstack/nova-metadata-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.329028 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35c3bbd2-947c-45ba-a855-7bcd2649d34f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"35c3bbd2-947c-45ba-a855-7bcd2649d34f\") " pod="openstack/nova-metadata-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.329048 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlxz5\" (UniqueName: \"kubernetes.io/projected/338060e9-0b03-460e-a0d8-28f78a5de526-kube-api-access-hlxz5\") pod \"nova-scheduler-0\" (UID: \"338060e9-0b03-460e-a0d8-28f78a5de526\") " pod="openstack/nova-scheduler-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.330976 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-wrp7j"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.344847 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-wrp7j"]
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.430955 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/338060e9-0b03-460e-a0d8-28f78a5de526-config-data\") pod \"nova-scheduler-0\" (UID: \"338060e9-0b03-460e-a0d8-28f78a5de526\") " pod="openstack/nova-scheduler-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.431000 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35c3bbd2-947c-45ba-a855-7bcd2649d34f-logs\") pod \"nova-metadata-0\" (UID: \"35c3bbd2-947c-45ba-a855-7bcd2649d34f\") " pod="openstack/nova-metadata-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.431027 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tldtq\" (UniqueName: \"kubernetes.io/projected/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-kube-api-access-tldtq\") pod \"dnsmasq-dns-757b4f8459-wrp7j\" (UID: \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\") " pod="openstack/dnsmasq-dns-757b4f8459-wrp7j"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.431048 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-config\") pod \"dnsmasq-dns-757b4f8459-wrp7j\" (UID: \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\") " pod="openstack/dnsmasq-dns-757b4f8459-wrp7j"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.431086 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35c3bbd2-947c-45ba-a855-7bcd2649d34f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"35c3bbd2-947c-45ba-a855-7bcd2649d34f\") " pod="openstack/nova-metadata-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.431107 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlxz5\" (UniqueName: \"kubernetes.io/projected/338060e9-0b03-460e-a0d8-28f78a5de526-kube-api-access-hlxz5\") pod \"nova-scheduler-0\" (UID: \"338060e9-0b03-460e-a0d8-28f78a5de526\") " pod="openstack/nova-scheduler-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.431123 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-wrp7j\" (UID: \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\") " pod="openstack/dnsmasq-dns-757b4f8459-wrp7j"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.431142 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-wrp7j\" (UID: \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\") " pod="openstack/dnsmasq-dns-757b4f8459-wrp7j"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.431178 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg6vx\" (UniqueName: \"kubernetes.io/projected/35c3bbd2-947c-45ba-a855-7bcd2649d34f-kube-api-access-kg6vx\") pod \"nova-metadata-0\" (UID: \"35c3bbd2-947c-45ba-a855-7bcd2649d34f\") " pod="openstack/nova-metadata-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.431203 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35c3bbd2-947c-45ba-a855-7bcd2649d34f-config-data\") pod \"nova-metadata-0\" (UID: \"35c3bbd2-947c-45ba-a855-7bcd2649d34f\") " pod="openstack/nova-metadata-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.431222 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/338060e9-0b03-460e-a0d8-28f78a5de526-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"338060e9-0b03-460e-a0d8-28f78a5de526\") " pod="openstack/nova-scheduler-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.431266 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-wrp7j\" (UID: \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\") " pod="openstack/dnsmasq-dns-757b4f8459-wrp7j"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.431301 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-dns-svc\") pod \"dnsmasq-dns-757b4f8459-wrp7j\" (UID: \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\") " pod="openstack/dnsmasq-dns-757b4f8459-wrp7j"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.432073 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35c3bbd2-947c-45ba-a855-7bcd2649d34f-logs\") pod \"nova-metadata-0\" (UID: \"35c3bbd2-947c-45ba-a855-7bcd2649d34f\") " pod="openstack/nova-metadata-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.437390 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35c3bbd2-947c-45ba-a855-7bcd2649d34f-config-data\") pod \"nova-metadata-0\" (UID: \"35c3bbd2-947c-45ba-a855-7bcd2649d34f\") " pod="openstack/nova-metadata-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.438419 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/338060e9-0b03-460e-a0d8-28f78a5de526-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"338060e9-0b03-460e-a0d8-28f78a5de526\") " pod="openstack/nova-scheduler-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.443215 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35c3bbd2-947c-45ba-a855-7bcd2649d34f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"35c3bbd2-947c-45ba-a855-7bcd2649d34f\") " pod="openstack/nova-metadata-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.451460 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/338060e9-0b03-460e-a0d8-28f78a5de526-config-data\") pod \"nova-scheduler-0\" (UID: \"338060e9-0b03-460e-a0d8-28f78a5de526\") " pod="openstack/nova-scheduler-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.462904 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlxz5\" (UniqueName: \"kubernetes.io/projected/338060e9-0b03-460e-a0d8-28f78a5de526-kube-api-access-hlxz5\") pod \"nova-scheduler-0\" (UID: \"338060e9-0b03-460e-a0d8-28f78a5de526\") " pod="openstack/nova-scheduler-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.465843 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg6vx\" (UniqueName: \"kubernetes.io/projected/35c3bbd2-947c-45ba-a855-7bcd2649d34f-kube-api-access-kg6vx\") pod \"nova-metadata-0\" (UID: \"35c3bbd2-947c-45ba-a855-7bcd2649d34f\") " pod="openstack/nova-metadata-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.508381 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.535248 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-wrp7j\" (UID: \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\") " pod="openstack/dnsmasq-dns-757b4f8459-wrp7j"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.535290 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-wrp7j\" (UID: \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\") " pod="openstack/dnsmasq-dns-757b4f8459-wrp7j"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.535368 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-wrp7j\" (UID: \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\") " pod="openstack/dnsmasq-dns-757b4f8459-wrp7j"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.535400 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-dns-svc\") pod \"dnsmasq-dns-757b4f8459-wrp7j\" (UID: \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\") " pod="openstack/dnsmasq-dns-757b4f8459-wrp7j"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.535464 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-config\") pod \"dnsmasq-dns-757b4f8459-wrp7j\" (UID: \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\") " pod="openstack/dnsmasq-dns-757b4f8459-wrp7j"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.535482 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tldtq\" (UniqueName: \"kubernetes.io/projected/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-kube-api-access-tldtq\") pod \"dnsmasq-dns-757b4f8459-wrp7j\" (UID: \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\") " pod="openstack/dnsmasq-dns-757b4f8459-wrp7j"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.536673 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-wrp7j\" (UID: \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\") " pod="openstack/dnsmasq-dns-757b4f8459-wrp7j"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.537202 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-dns-svc\") pod \"dnsmasq-dns-757b4f8459-wrp7j\" (UID: \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\") " pod="openstack/dnsmasq-dns-757b4f8459-wrp7j"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.537713 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-wrp7j\" (UID: \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\") " pod="openstack/dnsmasq-dns-757b4f8459-wrp7j"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.538095 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-config\") pod \"dnsmasq-dns-757b4f8459-wrp7j\" (UID: \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\") " pod="openstack/dnsmasq-dns-757b4f8459-wrp7j"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.538166 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.538189 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-wrp7j\" (UID: \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\") " pod="openstack/dnsmasq-dns-757b4f8459-wrp7j"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.558289 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tldtq\" (UniqueName: \"kubernetes.io/projected/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-kube-api-access-tldtq\") pod \"dnsmasq-dns-757b4f8459-wrp7j\" (UID: \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\") " pod="openstack/dnsmasq-dns-757b4f8459-wrp7j"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.601015 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.747810 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-wrp7j"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.853612 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wbsdh"]
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.855022 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wbsdh"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.860569 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.860835 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.896490 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wbsdh"]
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.900000 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2da880ed-7e25-4267-87cc-d093f502b847","Type":"ContainerStarted","Data":"ab80c8598c5df28b4fcdc603bf3722a1c06f868361661b25ab191b31c754a7b4"}
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.904855 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 18 06:07:31 crc kubenswrapper[4869]: W0218 06:07:31.955194 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dc931b7_a618_4dc7_b89e_7c516d699154.slice/crio-2c56e9372cc18781425bb7b7c095d160be4dbbd0b544b3dab3beb8c47e891c39 WatchSource:0}: Error finding container 2c56e9372cc18781425bb7b7c095d160be4dbbd0b544b3dab3beb8c47e891c39: Status 404 returned error can't find the container with id 2c56e9372cc18781425bb7b7c095d160be4dbbd0b544b3dab3beb8c47e891c39
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.961533 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaf6dc38-fdba-4044-8f8b-b47d4d03db0a-config-data\") pod \"nova-cell1-conductor-db-sync-wbsdh\" (UID: \"aaf6dc38-fdba-4044-8f8b-b47d4d03db0a\") " pod="openstack/nova-cell1-conductor-db-sync-wbsdh"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.961762 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvldg\" (UniqueName: \"kubernetes.io/projected/aaf6dc38-fdba-4044-8f8b-b47d4d03db0a-kube-api-access-dvldg\") pod \"nova-cell1-conductor-db-sync-wbsdh\" (UID: \"aaf6dc38-fdba-4044-8f8b-b47d4d03db0a\") " pod="openstack/nova-cell1-conductor-db-sync-wbsdh"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.961828 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaf6dc38-fdba-4044-8f8b-b47d4d03db0a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wbsdh\" (UID: \"aaf6dc38-fdba-4044-8f8b-b47d4d03db0a\") " pod="openstack/nova-cell1-conductor-db-sync-wbsdh"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.961874 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaf6dc38-fdba-4044-8f8b-b47d4d03db0a-scripts\") pod \"nova-cell1-conductor-db-sync-wbsdh\" (UID: \"aaf6dc38-fdba-4044-8f8b-b47d4d03db0a\") " pod="openstack/nova-cell1-conductor-db-sync-wbsdh"
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.984590 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-74k4h"]
Feb 18 06:07:31 crc kubenswrapper[4869]: I0218 06:07:31.985694 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.909589005 podStartE2EDuration="5.985675174s" podCreationTimestamp="2026-02-18 06:07:26 +0000 UTC" firstStartedPulling="2026-02-18 06:07:26.993830245 +0000 UTC m=+1144.162918477" lastFinishedPulling="2026-02-18 06:07:31.069916414 +0000 UTC m=+1148.239004646" observedRunningTime="2026-02-18 06:07:31.959436128 +0000 UTC m=+1149.128524380" watchObservedRunningTime="2026-02-18 06:07:31.985675174 +0000 UTC m=+1149.154763396"
Feb 18 06:07:31 crc kubenswrapper[4869]: W0218 06:07:31.986209 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1af28061_c2b3_4800_8435_a6c2ca8c7cf0.slice/crio-26892d4888d6fd94a61d78013e94e11c614ef2b490f0cf052a1088877152fff9 WatchSource:0}: Error finding container 26892d4888d6fd94a61d78013e94e11c614ef2b490f0cf052a1088877152fff9: Status 404 returned error can't find the container with id 26892d4888d6fd94a61d78013e94e11c614ef2b490f0cf052a1088877152fff9
Feb 18 06:07:32 crc kubenswrapper[4869]: I0218 06:07:32.019982 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 18 06:07:32 crc kubenswrapper[4869]: I0218 06:07:32.035016 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 18 06:07:32 crc kubenswrapper[4869]: I0218 06:07:32.063818 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvldg\" (UniqueName: \"kubernetes.io/projected/aaf6dc38-fdba-4044-8f8b-b47d4d03db0a-kube-api-access-dvldg\") pod \"nova-cell1-conductor-db-sync-wbsdh\" (UID: \"aaf6dc38-fdba-4044-8f8b-b47d4d03db0a\") " pod="openstack/nova-cell1-conductor-db-sync-wbsdh"
Feb 18 06:07:32 crc kubenswrapper[4869]: I0218 06:07:32.063898 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaf6dc38-fdba-4044-8f8b-b47d4d03db0a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wbsdh\" (UID: \"aaf6dc38-fdba-4044-8f8b-b47d4d03db0a\") " pod="openstack/nova-cell1-conductor-db-sync-wbsdh"
Feb 18 06:07:32 crc kubenswrapper[4869]: I0218 06:07:32.063931 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaf6dc38-fdba-4044-8f8b-b47d4d03db0a-scripts\") pod \"nova-cell1-conductor-db-sync-wbsdh\" (UID: \"aaf6dc38-fdba-4044-8f8b-b47d4d03db0a\") " pod="openstack/nova-cell1-conductor-db-sync-wbsdh"
Feb 18 06:07:32 crc kubenswrapper[4869]: I0218 06:07:32.064011 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaf6dc38-fdba-4044-8f8b-b47d4d03db0a-config-data\") pod \"nova-cell1-conductor-db-sync-wbsdh\" (UID: \"aaf6dc38-fdba-4044-8f8b-b47d4d03db0a\") " pod="openstack/nova-cell1-conductor-db-sync-wbsdh"
Feb 18 06:07:32 crc kubenswrapper[4869]: I0218 06:07:32.067783 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaf6dc38-fdba-4044-8f8b-b47d4d03db0a-scripts\") pod \"nova-cell1-conductor-db-sync-wbsdh\" (UID: \"aaf6dc38-fdba-4044-8f8b-b47d4d03db0a\") " pod="openstack/nova-cell1-conductor-db-sync-wbsdh"
Feb 18 06:07:32 crc kubenswrapper[4869]: I0218 06:07:32.071660 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaf6dc38-fdba-4044-8f8b-b47d4d03db0a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wbsdh\" (UID: \"aaf6dc38-fdba-4044-8f8b-b47d4d03db0a\") " pod="openstack/nova-cell1-conductor-db-sync-wbsdh"
Feb 18 06:07:32 crc kubenswrapper[4869]: I0218 06:07:32.083527 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaf6dc38-fdba-4044-8f8b-b47d4d03db0a-config-data\") pod \"nova-cell1-conductor-db-sync-wbsdh\" (UID: \"aaf6dc38-fdba-4044-8f8b-b47d4d03db0a\") " pod="openstack/nova-cell1-conductor-db-sync-wbsdh"
Feb 18 06:07:32 crc kubenswrapper[4869]: I0218 06:07:32.130612 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvldg\" (UniqueName: \"kubernetes.io/projected/aaf6dc38-fdba-4044-8f8b-b47d4d03db0a-kube-api-access-dvldg\") pod \"nova-cell1-conductor-db-sync-wbsdh\" (UID: \"aaf6dc38-fdba-4044-8f8b-b47d4d03db0a\") " pod="openstack/nova-cell1-conductor-db-sync-wbsdh"
Feb 18 06:07:32 crc kubenswrapper[4869]: I0218 06:07:32.204197 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wbsdh"
Feb 18 06:07:32 crc kubenswrapper[4869]: I0218 06:07:32.480587 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 06:07:32 crc kubenswrapper[4869]: I0218 06:07:32.488010 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-wrp7j"]
Feb 18 06:07:32 crc kubenswrapper[4869]: I0218 06:07:32.552736 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 18 06:07:32 crc kubenswrapper[4869]: I0218 06:07:32.788969 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wbsdh"]
Feb 18 06:07:32 crc kubenswrapper[4869]: I0218 06:07:32.916043 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-74k4h" event={"ID":"8dc931b7-a618-4dc7-b89e-7c516d699154","Type":"ContainerStarted","Data":"422c5e067356db016952362b4780dcd593e8431517871458c0caa57e91ae6645"}
Feb 18 06:07:32 crc kubenswrapper[4869]: I0218 06:07:32.916086 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-74k4h" event={"ID":"8dc931b7-a618-4dc7-b89e-7c516d699154","Type":"ContainerStarted","Data":"2c56e9372cc18781425bb7b7c095d160be4dbbd0b544b3dab3beb8c47e891c39"}
Feb 18 06:07:32 crc kubenswrapper[4869]: I0218 06:07:32.918368 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1af28061-c2b3-4800-8435-a6c2ca8c7cf0","Type":"ContainerStarted","Data":"26892d4888d6fd94a61d78013e94e11c614ef2b490f0cf052a1088877152fff9"}
Feb 18
06:07:32 crc kubenswrapper[4869]: I0218 06:07:32.924093 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"338060e9-0b03-460e-a0d8-28f78a5de526","Type":"ContainerStarted","Data":"a0004efd335d4fa444a524b7e474e7a02fb75756e6696de8f425af190fe35d79"} Feb 18 06:07:32 crc kubenswrapper[4869]: I0218 06:07:32.932237 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-74k4h" podStartSLOduration=2.9322126429999997 podStartE2EDuration="2.932212643s" podCreationTimestamp="2026-02-18 06:07:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:07:32.92921521 +0000 UTC m=+1150.098303432" watchObservedRunningTime="2026-02-18 06:07:32.932212643 +0000 UTC m=+1150.101300885" Feb 18 06:07:32 crc kubenswrapper[4869]: I0218 06:07:32.932456 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wbsdh" event={"ID":"aaf6dc38-fdba-4044-8f8b-b47d4d03db0a","Type":"ContainerStarted","Data":"8d671906394af9554af6e1eaa7d78c7eab2f4eee3a4003d777462411ddc912d2"} Feb 18 06:07:32 crc kubenswrapper[4869]: I0218 06:07:32.943267 4869 generic.go:334] "Generic (PLEG): container finished" podID="cbaac7cd-b5f2-40f5-8482-c29eafa37f95" containerID="4ad4c1c36649bb86f32dbe7a86f98fbdd7583556a226e98328f07c7c2292a91c" exitCode=0 Feb 18 06:07:32 crc kubenswrapper[4869]: I0218 06:07:32.943320 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-wrp7j" event={"ID":"cbaac7cd-b5f2-40f5-8482-c29eafa37f95","Type":"ContainerDied","Data":"4ad4c1c36649bb86f32dbe7a86f98fbdd7583556a226e98328f07c7c2292a91c"} Feb 18 06:07:32 crc kubenswrapper[4869]: I0218 06:07:32.943337 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-wrp7j" 
event={"ID":"cbaac7cd-b5f2-40f5-8482-c29eafa37f95","Type":"ContainerStarted","Data":"a326fbff0636845ea56e113ed2109900fa5cb07d1e2c80a6ae60d809c1629f3a"} Feb 18 06:07:32 crc kubenswrapper[4869]: I0218 06:07:32.944625 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"040199a9-8498-4011-b50c-051616a21fff","Type":"ContainerStarted","Data":"eb9405e6590ade2755a61e81812e9ab7ba710a21b252025d0b2d7779cbc83335"} Feb 18 06:07:32 crc kubenswrapper[4869]: I0218 06:07:32.947242 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"35c3bbd2-947c-45ba-a855-7bcd2649d34f","Type":"ContainerStarted","Data":"ff74137a7982f320ca69d562214a63a3fe91d8e53b27d1b6075d5197e9c9b0f6"} Feb 18 06:07:33 crc kubenswrapper[4869]: I0218 06:07:33.955888 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wbsdh" event={"ID":"aaf6dc38-fdba-4044-8f8b-b47d4d03db0a","Type":"ContainerStarted","Data":"134374408ed805a961975ba78d69790f9ef78101020a32401c5e9979e6211de8"} Feb 18 06:07:33 crc kubenswrapper[4869]: I0218 06:07:33.959662 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-wrp7j" event={"ID":"cbaac7cd-b5f2-40f5-8482-c29eafa37f95","Type":"ContainerStarted","Data":"e7d4c7549611d9133c3ca1001f91e2c44e6e2d2848304616d2f74ea61eb2f5a1"} Feb 18 06:07:33 crc kubenswrapper[4869]: I0218 06:07:33.959903 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-wrp7j" Feb 18 06:07:33 crc kubenswrapper[4869]: I0218 06:07:33.977950 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-wbsdh" podStartSLOduration=2.9779265710000002 podStartE2EDuration="2.977926571s" podCreationTimestamp="2026-02-18 06:07:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-18 06:07:33.971047944 +0000 UTC m=+1151.140136176" watchObservedRunningTime="2026-02-18 06:07:33.977926571 +0000 UTC m=+1151.147014803" Feb 18 06:07:33 crc kubenswrapper[4869]: I0218 06:07:33.996847 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-wrp7j" podStartSLOduration=2.996826361 podStartE2EDuration="2.996826361s" podCreationTimestamp="2026-02-18 06:07:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:07:33.991936981 +0000 UTC m=+1151.161025223" watchObservedRunningTime="2026-02-18 06:07:33.996826361 +0000 UTC m=+1151.165914583" Feb 18 06:07:34 crc kubenswrapper[4869]: I0218 06:07:34.577528 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 18 06:07:34 crc kubenswrapper[4869]: I0218 06:07:34.610698 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:07:36 crc kubenswrapper[4869]: I0218 06:07:36.991346 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"040199a9-8498-4011-b50c-051616a21fff","Type":"ContainerStarted","Data":"552facee59a394c895ebf23a585386632c7f13a46c8a7a6232254cca28753acb"} Feb 18 06:07:36 crc kubenswrapper[4869]: I0218 06:07:36.991571 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="040199a9-8498-4011-b50c-051616a21fff" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://552facee59a394c895ebf23a585386632c7f13a46c8a7a6232254cca28753acb" gracePeriod=30 Feb 18 06:07:36 crc kubenswrapper[4869]: I0218 06:07:36.993692 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"35c3bbd2-947c-45ba-a855-7bcd2649d34f","Type":"ContainerStarted","Data":"94e3ed75dc022c5c14fc8f4426ba07fba8dea9dd6c4d1bee1f3b4f063707b45e"} Feb 18 06:07:36 crc kubenswrapper[4869]: I0218 06:07:36.993723 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"35c3bbd2-947c-45ba-a855-7bcd2649d34f","Type":"ContainerStarted","Data":"d93a710db68d60fc56dfe0bb796d9889ba03845f5a1cc7924307065f3c58d6b2"} Feb 18 06:07:36 crc kubenswrapper[4869]: I0218 06:07:36.993971 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="35c3bbd2-947c-45ba-a855-7bcd2649d34f" containerName="nova-metadata-log" containerID="cri-o://d93a710db68d60fc56dfe0bb796d9889ba03845f5a1cc7924307065f3c58d6b2" gracePeriod=30 Feb 18 06:07:36 crc kubenswrapper[4869]: I0218 06:07:36.994389 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="35c3bbd2-947c-45ba-a855-7bcd2649d34f" containerName="nova-metadata-metadata" containerID="cri-o://94e3ed75dc022c5c14fc8f4426ba07fba8dea9dd6c4d1bee1f3b4f063707b45e" gracePeriod=30 Feb 18 06:07:36 crc kubenswrapper[4869]: I0218 06:07:36.999363 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1af28061-c2b3-4800-8435-a6c2ca8c7cf0","Type":"ContainerStarted","Data":"00aafa79c39d94c976f856d0830f96dc21f72d12704d823f3e40b70f8f5fc15f"} Feb 18 06:07:36 crc kubenswrapper[4869]: I0218 06:07:36.999397 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1af28061-c2b3-4800-8435-a6c2ca8c7cf0","Type":"ContainerStarted","Data":"5a7927188eea91807f91983863255b3e1ce74e9d7aa4ffbaee83f130ea3e46d5"} Feb 18 06:07:37 crc kubenswrapper[4869]: I0218 06:07:37.004357 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"338060e9-0b03-460e-a0d8-28f78a5de526","Type":"ContainerStarted","Data":"ca19f3f6229c0237377a4b72f5acf8aeea46af499069486ab62e7421276adb37"} Feb 18 06:07:37 crc kubenswrapper[4869]: I0218 06:07:37.022786 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.393280565 podStartE2EDuration="7.022737582s" podCreationTimestamp="2026-02-18 06:07:30 +0000 UTC" firstStartedPulling="2026-02-18 06:07:32.058515884 +0000 UTC m=+1149.227604116" lastFinishedPulling="2026-02-18 06:07:35.687972901 +0000 UTC m=+1152.857061133" observedRunningTime="2026-02-18 06:07:37.01729071 +0000 UTC m=+1154.186378962" watchObservedRunningTime="2026-02-18 06:07:37.022737582 +0000 UTC m=+1154.191825814" Feb 18 06:07:37 crc kubenswrapper[4869]: I0218 06:07:37.055537 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.371072045 podStartE2EDuration="7.055512968s" podCreationTimestamp="2026-02-18 06:07:30 +0000 UTC" firstStartedPulling="2026-02-18 06:07:32.013703495 +0000 UTC m=+1149.182791727" lastFinishedPulling="2026-02-18 06:07:35.698144418 +0000 UTC m=+1152.867232650" observedRunningTime="2026-02-18 06:07:37.043385304 +0000 UTC m=+1154.212473546" watchObservedRunningTime="2026-02-18 06:07:37.055512968 +0000 UTC m=+1154.224601200" Feb 18 06:07:37 crc kubenswrapper[4869]: I0218 06:07:37.072018 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.975316488 podStartE2EDuration="6.071998519s" podCreationTimestamp="2026-02-18 06:07:31 +0000 UTC" firstStartedPulling="2026-02-18 06:07:32.590645074 +0000 UTC m=+1149.759733296" lastFinishedPulling="2026-02-18 06:07:35.687327095 +0000 UTC m=+1152.856415327" observedRunningTime="2026-02-18 06:07:37.061285889 +0000 UTC m=+1154.230374111" watchObservedRunningTime="2026-02-18 06:07:37.071998519 +0000 UTC m=+1154.241086751" Feb 
18 06:07:37 crc kubenswrapper[4869]: I0218 06:07:37.083439 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.910182394 podStartE2EDuration="6.083412346s" podCreationTimestamp="2026-02-18 06:07:31 +0000 UTC" firstStartedPulling="2026-02-18 06:07:32.514726679 +0000 UTC m=+1149.683814911" lastFinishedPulling="2026-02-18 06:07:35.687956631 +0000 UTC m=+1152.857044863" observedRunningTime="2026-02-18 06:07:37.078064856 +0000 UTC m=+1154.247153088" watchObservedRunningTime="2026-02-18 06:07:37.083412346 +0000 UTC m=+1154.252500588" Feb 18 06:07:37 crc kubenswrapper[4869]: I0218 06:07:37.577594 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 06:07:37 crc kubenswrapper[4869]: I0218 06:07:37.686338 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg6vx\" (UniqueName: \"kubernetes.io/projected/35c3bbd2-947c-45ba-a855-7bcd2649d34f-kube-api-access-kg6vx\") pod \"35c3bbd2-947c-45ba-a855-7bcd2649d34f\" (UID: \"35c3bbd2-947c-45ba-a855-7bcd2649d34f\") " Feb 18 06:07:37 crc kubenswrapper[4869]: I0218 06:07:37.686423 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35c3bbd2-947c-45ba-a855-7bcd2649d34f-combined-ca-bundle\") pod \"35c3bbd2-947c-45ba-a855-7bcd2649d34f\" (UID: \"35c3bbd2-947c-45ba-a855-7bcd2649d34f\") " Feb 18 06:07:37 crc kubenswrapper[4869]: I0218 06:07:37.686486 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35c3bbd2-947c-45ba-a855-7bcd2649d34f-logs\") pod \"35c3bbd2-947c-45ba-a855-7bcd2649d34f\" (UID: \"35c3bbd2-947c-45ba-a855-7bcd2649d34f\") " Feb 18 06:07:37 crc kubenswrapper[4869]: I0218 06:07:37.686566 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/35c3bbd2-947c-45ba-a855-7bcd2649d34f-config-data\") pod \"35c3bbd2-947c-45ba-a855-7bcd2649d34f\" (UID: \"35c3bbd2-947c-45ba-a855-7bcd2649d34f\") " Feb 18 06:07:37 crc kubenswrapper[4869]: I0218 06:07:37.687727 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35c3bbd2-947c-45ba-a855-7bcd2649d34f-logs" (OuterVolumeSpecName: "logs") pod "35c3bbd2-947c-45ba-a855-7bcd2649d34f" (UID: "35c3bbd2-947c-45ba-a855-7bcd2649d34f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:07:37 crc kubenswrapper[4869]: I0218 06:07:37.700927 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35c3bbd2-947c-45ba-a855-7bcd2649d34f-kube-api-access-kg6vx" (OuterVolumeSpecName: "kube-api-access-kg6vx") pod "35c3bbd2-947c-45ba-a855-7bcd2649d34f" (UID: "35c3bbd2-947c-45ba-a855-7bcd2649d34f"). InnerVolumeSpecName "kube-api-access-kg6vx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:37 crc kubenswrapper[4869]: I0218 06:07:37.723335 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35c3bbd2-947c-45ba-a855-7bcd2649d34f-config-data" (OuterVolumeSpecName: "config-data") pod "35c3bbd2-947c-45ba-a855-7bcd2649d34f" (UID: "35c3bbd2-947c-45ba-a855-7bcd2649d34f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:37 crc kubenswrapper[4869]: I0218 06:07:37.728079 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35c3bbd2-947c-45ba-a855-7bcd2649d34f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35c3bbd2-947c-45ba-a855-7bcd2649d34f" (UID: "35c3bbd2-947c-45ba-a855-7bcd2649d34f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:37 crc kubenswrapper[4869]: I0218 06:07:37.789192 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg6vx\" (UniqueName: \"kubernetes.io/projected/35c3bbd2-947c-45ba-a855-7bcd2649d34f-kube-api-access-kg6vx\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:37 crc kubenswrapper[4869]: I0218 06:07:37.789233 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35c3bbd2-947c-45ba-a855-7bcd2649d34f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:37 crc kubenswrapper[4869]: I0218 06:07:37.789242 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35c3bbd2-947c-45ba-a855-7bcd2649d34f-logs\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:37 crc kubenswrapper[4869]: I0218 06:07:37.789252 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35c3bbd2-947c-45ba-a855-7bcd2649d34f-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.014717 4869 generic.go:334] "Generic (PLEG): container finished" podID="35c3bbd2-947c-45ba-a855-7bcd2649d34f" containerID="94e3ed75dc022c5c14fc8f4426ba07fba8dea9dd6c4d1bee1f3b4f063707b45e" exitCode=0 Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.014807 4869 generic.go:334] "Generic (PLEG): container finished" podID="35c3bbd2-947c-45ba-a855-7bcd2649d34f" containerID="d93a710db68d60fc56dfe0bb796d9889ba03845f5a1cc7924307065f3c58d6b2" exitCode=143 Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.014829 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"35c3bbd2-947c-45ba-a855-7bcd2649d34f","Type":"ContainerDied","Data":"94e3ed75dc022c5c14fc8f4426ba07fba8dea9dd6c4d1bee1f3b4f063707b45e"} Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.014896 4869 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"35c3bbd2-947c-45ba-a855-7bcd2649d34f","Type":"ContainerDied","Data":"d93a710db68d60fc56dfe0bb796d9889ba03845f5a1cc7924307065f3c58d6b2"} Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.014908 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"35c3bbd2-947c-45ba-a855-7bcd2649d34f","Type":"ContainerDied","Data":"ff74137a7982f320ca69d562214a63a3fe91d8e53b27d1b6075d5197e9c9b0f6"} Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.014923 4869 scope.go:117] "RemoveContainer" containerID="94e3ed75dc022c5c14fc8f4426ba07fba8dea9dd6c4d1bee1f3b4f063707b45e" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.015776 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.050861 4869 scope.go:117] "RemoveContainer" containerID="d93a710db68d60fc56dfe0bb796d9889ba03845f5a1cc7924307065f3c58d6b2" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.052036 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.064886 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.082660 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:07:38 crc kubenswrapper[4869]: E0218 06:07:38.083309 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35c3bbd2-947c-45ba-a855-7bcd2649d34f" containerName="nova-metadata-metadata" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.083320 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c3bbd2-947c-45ba-a855-7bcd2649d34f" containerName="nova-metadata-metadata" Feb 18 06:07:38 crc kubenswrapper[4869]: E0218 06:07:38.083342 4869 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35c3bbd2-947c-45ba-a855-7bcd2649d34f" containerName="nova-metadata-log" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.083348 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c3bbd2-947c-45ba-a855-7bcd2649d34f" containerName="nova-metadata-log" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.083519 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="35c3bbd2-947c-45ba-a855-7bcd2649d34f" containerName="nova-metadata-metadata" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.083532 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="35c3bbd2-947c-45ba-a855-7bcd2649d34f" containerName="nova-metadata-log" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.085570 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.089097 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.091617 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.094078 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.110084 4869 scope.go:117] "RemoveContainer" containerID="94e3ed75dc022c5c14fc8f4426ba07fba8dea9dd6c4d1bee1f3b4f063707b45e" Feb 18 06:07:38 crc kubenswrapper[4869]: E0218 06:07:38.111122 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94e3ed75dc022c5c14fc8f4426ba07fba8dea9dd6c4d1bee1f3b4f063707b45e\": container with ID starting with 94e3ed75dc022c5c14fc8f4426ba07fba8dea9dd6c4d1bee1f3b4f063707b45e not found: ID does not exist" 
containerID="94e3ed75dc022c5c14fc8f4426ba07fba8dea9dd6c4d1bee1f3b4f063707b45e" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.111153 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94e3ed75dc022c5c14fc8f4426ba07fba8dea9dd6c4d1bee1f3b4f063707b45e"} err="failed to get container status \"94e3ed75dc022c5c14fc8f4426ba07fba8dea9dd6c4d1bee1f3b4f063707b45e\": rpc error: code = NotFound desc = could not find container \"94e3ed75dc022c5c14fc8f4426ba07fba8dea9dd6c4d1bee1f3b4f063707b45e\": container with ID starting with 94e3ed75dc022c5c14fc8f4426ba07fba8dea9dd6c4d1bee1f3b4f063707b45e not found: ID does not exist" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.111181 4869 scope.go:117] "RemoveContainer" containerID="d93a710db68d60fc56dfe0bb796d9889ba03845f5a1cc7924307065f3c58d6b2" Feb 18 06:07:38 crc kubenswrapper[4869]: E0218 06:07:38.113664 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d93a710db68d60fc56dfe0bb796d9889ba03845f5a1cc7924307065f3c58d6b2\": container with ID starting with d93a710db68d60fc56dfe0bb796d9889ba03845f5a1cc7924307065f3c58d6b2 not found: ID does not exist" containerID="d93a710db68d60fc56dfe0bb796d9889ba03845f5a1cc7924307065f3c58d6b2" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.113712 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d93a710db68d60fc56dfe0bb796d9889ba03845f5a1cc7924307065f3c58d6b2"} err="failed to get container status \"d93a710db68d60fc56dfe0bb796d9889ba03845f5a1cc7924307065f3c58d6b2\": rpc error: code = NotFound desc = could not find container \"d93a710db68d60fc56dfe0bb796d9889ba03845f5a1cc7924307065f3c58d6b2\": container with ID starting with d93a710db68d60fc56dfe0bb796d9889ba03845f5a1cc7924307065f3c58d6b2 not found: ID does not exist" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.113756 4869 scope.go:117] 
"RemoveContainer" containerID="94e3ed75dc022c5c14fc8f4426ba07fba8dea9dd6c4d1bee1f3b4f063707b45e" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.114129 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94e3ed75dc022c5c14fc8f4426ba07fba8dea9dd6c4d1bee1f3b4f063707b45e"} err="failed to get container status \"94e3ed75dc022c5c14fc8f4426ba07fba8dea9dd6c4d1bee1f3b4f063707b45e\": rpc error: code = NotFound desc = could not find container \"94e3ed75dc022c5c14fc8f4426ba07fba8dea9dd6c4d1bee1f3b4f063707b45e\": container with ID starting with 94e3ed75dc022c5c14fc8f4426ba07fba8dea9dd6c4d1bee1f3b4f063707b45e not found: ID does not exist" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.114143 4869 scope.go:117] "RemoveContainer" containerID="d93a710db68d60fc56dfe0bb796d9889ba03845f5a1cc7924307065f3c58d6b2" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.115398 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d93a710db68d60fc56dfe0bb796d9889ba03845f5a1cc7924307065f3c58d6b2"} err="failed to get container status \"d93a710db68d60fc56dfe0bb796d9889ba03845f5a1cc7924307065f3c58d6b2\": rpc error: code = NotFound desc = could not find container \"d93a710db68d60fc56dfe0bb796d9889ba03845f5a1cc7924307065f3c58d6b2\": container with ID starting with d93a710db68d60fc56dfe0bb796d9889ba03845f5a1cc7924307065f3c58d6b2 not found: ID does not exist" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.195477 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/822b5889-38d9-4e39-b043-5c52152c6917-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"822b5889-38d9-4e39-b043-5c52152c6917\") " pod="openstack/nova-metadata-0" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.195571 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/822b5889-38d9-4e39-b043-5c52152c6917-logs\") pod \"nova-metadata-0\" (UID: \"822b5889-38d9-4e39-b043-5c52152c6917\") " pod="openstack/nova-metadata-0" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.195911 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/822b5889-38d9-4e39-b043-5c52152c6917-config-data\") pod \"nova-metadata-0\" (UID: \"822b5889-38d9-4e39-b043-5c52152c6917\") " pod="openstack/nova-metadata-0" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.196004 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxv4t\" (UniqueName: \"kubernetes.io/projected/822b5889-38d9-4e39-b043-5c52152c6917-kube-api-access-wxv4t\") pod \"nova-metadata-0\" (UID: \"822b5889-38d9-4e39-b043-5c52152c6917\") " pod="openstack/nova-metadata-0" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.196087 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/822b5889-38d9-4e39-b043-5c52152c6917-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"822b5889-38d9-4e39-b043-5c52152c6917\") " pod="openstack/nova-metadata-0" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.297886 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/822b5889-38d9-4e39-b043-5c52152c6917-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"822b5889-38d9-4e39-b043-5c52152c6917\") " pod="openstack/nova-metadata-0" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.298261 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/822b5889-38d9-4e39-b043-5c52152c6917-logs\") pod 
\"nova-metadata-0\" (UID: \"822b5889-38d9-4e39-b043-5c52152c6917\") " pod="openstack/nova-metadata-0" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.298337 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/822b5889-38d9-4e39-b043-5c52152c6917-config-data\") pod \"nova-metadata-0\" (UID: \"822b5889-38d9-4e39-b043-5c52152c6917\") " pod="openstack/nova-metadata-0" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.298384 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxv4t\" (UniqueName: \"kubernetes.io/projected/822b5889-38d9-4e39-b043-5c52152c6917-kube-api-access-wxv4t\") pod \"nova-metadata-0\" (UID: \"822b5889-38d9-4e39-b043-5c52152c6917\") " pod="openstack/nova-metadata-0" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.298453 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/822b5889-38d9-4e39-b043-5c52152c6917-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"822b5889-38d9-4e39-b043-5c52152c6917\") " pod="openstack/nova-metadata-0" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.300157 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/822b5889-38d9-4e39-b043-5c52152c6917-logs\") pod \"nova-metadata-0\" (UID: \"822b5889-38d9-4e39-b043-5c52152c6917\") " pod="openstack/nova-metadata-0" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.302496 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/822b5889-38d9-4e39-b043-5c52152c6917-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"822b5889-38d9-4e39-b043-5c52152c6917\") " pod="openstack/nova-metadata-0" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.303293 4869 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/822b5889-38d9-4e39-b043-5c52152c6917-config-data\") pod \"nova-metadata-0\" (UID: \"822b5889-38d9-4e39-b043-5c52152c6917\") " pod="openstack/nova-metadata-0" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.309561 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/822b5889-38d9-4e39-b043-5c52152c6917-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"822b5889-38d9-4e39-b043-5c52152c6917\") " pod="openstack/nova-metadata-0" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.322699 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxv4t\" (UniqueName: \"kubernetes.io/projected/822b5889-38d9-4e39-b043-5c52152c6917-kube-api-access-wxv4t\") pod \"nova-metadata-0\" (UID: \"822b5889-38d9-4e39-b043-5c52152c6917\") " pod="openstack/nova-metadata-0" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.437325 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 06:07:38 crc kubenswrapper[4869]: I0218 06:07:38.946571 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:07:39 crc kubenswrapper[4869]: I0218 06:07:39.029661 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"822b5889-38d9-4e39-b043-5c52152c6917","Type":"ContainerStarted","Data":"ebaadb5e53a6865b1c1dffcb72533cdf4399f92fed54c8cbb48ae624c2961878"} Feb 18 06:07:39 crc kubenswrapper[4869]: I0218 06:07:39.490205 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35c3bbd2-947c-45ba-a855-7bcd2649d34f" path="/var/lib/kubelet/pods/35c3bbd2-947c-45ba-a855-7bcd2649d34f/volumes" Feb 18 06:07:40 crc kubenswrapper[4869]: I0218 06:07:40.041066 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"822b5889-38d9-4e39-b043-5c52152c6917","Type":"ContainerStarted","Data":"7c4a11e77907552485dea7eb1a95b90312f34a1f37d0dbd2f7388429ca3d5c83"} Feb 18 06:07:40 crc kubenswrapper[4869]: I0218 06:07:40.041117 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"822b5889-38d9-4e39-b043-5c52152c6917","Type":"ContainerStarted","Data":"683daeac288a6bf39e7831e4dee5d762444b7de8df3c2699865162e734076e97"} Feb 18 06:07:40 crc kubenswrapper[4869]: I0218 06:07:40.081889 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.081867992 podStartE2EDuration="2.081867992s" podCreationTimestamp="2026-02-18 06:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:07:40.072169666 +0000 UTC m=+1157.241257898" watchObservedRunningTime="2026-02-18 06:07:40.081867992 +0000 UTC m=+1157.250956224" Feb 18 06:07:40 crc kubenswrapper[4869]: I0218 06:07:40.132566 4869 
patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:07:40 crc kubenswrapper[4869]: I0218 06:07:40.132621 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:07:40 crc kubenswrapper[4869]: I0218 06:07:40.132661 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" Feb 18 06:07:40 crc kubenswrapper[4869]: I0218 06:07:40.133147 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e88c90367f7599ac382291baac95a475e9f7f579d4283380c069d22ac74cf0e6"} pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 06:07:40 crc kubenswrapper[4869]: I0218 06:07:40.133204 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" containerID="cri-o://e88c90367f7599ac382291baac95a475e9f7f579d4283380c069d22ac74cf0e6" gracePeriod=600 Feb 18 06:07:41 crc kubenswrapper[4869]: I0218 06:07:41.060660 4869 generic.go:334] "Generic (PLEG): container finished" podID="8dc931b7-a618-4dc7-b89e-7c516d699154" containerID="422c5e067356db016952362b4780dcd593e8431517871458c0caa57e91ae6645" exitCode=0 Feb 18 06:07:41 crc 
kubenswrapper[4869]: I0218 06:07:41.061321 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-74k4h" event={"ID":"8dc931b7-a618-4dc7-b89e-7c516d699154","Type":"ContainerDied","Data":"422c5e067356db016952362b4780dcd593e8431517871458c0caa57e91ae6645"} Feb 18 06:07:41 crc kubenswrapper[4869]: I0218 06:07:41.065560 4869 generic.go:334] "Generic (PLEG): container finished" podID="aaf6dc38-fdba-4044-8f8b-b47d4d03db0a" containerID="134374408ed805a961975ba78d69790f9ef78101020a32401c5e9979e6211de8" exitCode=0 Feb 18 06:07:41 crc kubenswrapper[4869]: I0218 06:07:41.065768 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wbsdh" event={"ID":"aaf6dc38-fdba-4044-8f8b-b47d4d03db0a","Type":"ContainerDied","Data":"134374408ed805a961975ba78d69790f9ef78101020a32401c5e9979e6211de8"} Feb 18 06:07:41 crc kubenswrapper[4869]: I0218 06:07:41.076392 4869 generic.go:334] "Generic (PLEG): container finished" podID="781aec66-5fc7-4161-a704-cc78830d525d" containerID="e88c90367f7599ac382291baac95a475e9f7f579d4283380c069d22ac74cf0e6" exitCode=0 Feb 18 06:07:41 crc kubenswrapper[4869]: I0218 06:07:41.076486 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerDied","Data":"e88c90367f7599ac382291baac95a475e9f7f579d4283380c069d22ac74cf0e6"} Feb 18 06:07:41 crc kubenswrapper[4869]: I0218 06:07:41.076529 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerStarted","Data":"37492f897717923690caff194814f180694044bf504a2fcda1d5391e8ea76923"} Feb 18 06:07:41 crc kubenswrapper[4869]: I0218 06:07:41.076661 4869 scope.go:117] "RemoveContainer" containerID="e9e47b16933a8107451e04ae8f93c9313979bca3d095548d99cb42d4297f33ca" Feb 18 06:07:41 crc 
kubenswrapper[4869]: I0218 06:07:41.180239 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:07:41 crc kubenswrapper[4869]: I0218 06:07:41.508685 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 18 06:07:41 crc kubenswrapper[4869]: I0218 06:07:41.508766 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 18 06:07:41 crc kubenswrapper[4869]: I0218 06:07:41.601962 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 18 06:07:41 crc kubenswrapper[4869]: I0218 06:07:41.602010 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 18 06:07:41 crc kubenswrapper[4869]: I0218 06:07:41.638441 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 18 06:07:41 crc kubenswrapper[4869]: I0218 06:07:41.750558 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-wrp7j" Feb 18 06:07:41 crc kubenswrapper[4869]: I0218 06:07:41.809106 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-qd9jb"] Feb 18 06:07:41 crc kubenswrapper[4869]: I0218 06:07:41.809672 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" podUID="e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4" containerName="dnsmasq-dns" containerID="cri-o://86821b867917f19b5e7cf53b0d8f4e915c7e763184a8c5a5374ada8f15d21bb2" gracePeriod=10 Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.088159 4869 generic.go:334] "Generic (PLEG): container finished" podID="e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4" containerID="86821b867917f19b5e7cf53b0d8f4e915c7e763184a8c5a5374ada8f15d21bb2" exitCode=0 Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 
06:07:42.088203 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" event={"ID":"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4","Type":"ContainerDied","Data":"86821b867917f19b5e7cf53b0d8f4e915c7e763184a8c5a5374ada8f15d21bb2"} Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.132978 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.393011 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.485521 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-ovsdbserver-nb\") pod \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\" (UID: \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\") " Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.485582 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-dns-swift-storage-0\") pod \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\" (UID: \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\") " Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.485618 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-ovsdbserver-sb\") pod \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\" (UID: \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\") " Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.485724 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-dns-svc\") pod \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\" (UID: 
\"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\") " Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.485773 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-config\") pod \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\" (UID: \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\") " Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.485816 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48k6b\" (UniqueName: \"kubernetes.io/projected/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-kube-api-access-48k6b\") pod \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\" (UID: \"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4\") " Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.493835 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-kube-api-access-48k6b" (OuterVolumeSpecName: "kube-api-access-48k6b") pod "e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4" (UID: "e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4"). InnerVolumeSpecName "kube-api-access-48k6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.574513 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4" (UID: "e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.586083 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-config" (OuterVolumeSpecName: "config") pod "e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4" (UID: "e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.590277 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.590325 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.590337 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48k6b\" (UniqueName: \"kubernetes.io/projected/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-kube-api-access-48k6b\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.592879 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1af28061-c2b3-4800-8435-a6c2ca8c7cf0" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.592888 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1af28061-c2b3-4800-8435-a6c2ca8c7cf0" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.606175 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-74k4h" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.606281 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4" (UID: "e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.610765 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wbsdh" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.627618 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4" (UID: "e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.627638 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4" (UID: "e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.693064 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kdnl\" (UniqueName: \"kubernetes.io/projected/8dc931b7-a618-4dc7-b89e-7c516d699154-kube-api-access-5kdnl\") pod \"8dc931b7-a618-4dc7-b89e-7c516d699154\" (UID: \"8dc931b7-a618-4dc7-b89e-7c516d699154\") " Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.693129 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc931b7-a618-4dc7-b89e-7c516d699154-combined-ca-bundle\") pod \"8dc931b7-a618-4dc7-b89e-7c516d699154\" (UID: \"8dc931b7-a618-4dc7-b89e-7c516d699154\") " Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.693185 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaf6dc38-fdba-4044-8f8b-b47d4d03db0a-config-data\") pod \"aaf6dc38-fdba-4044-8f8b-b47d4d03db0a\" (UID: \"aaf6dc38-fdba-4044-8f8b-b47d4d03db0a\") " Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.693221 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dc931b7-a618-4dc7-b89e-7c516d699154-scripts\") pod \"8dc931b7-a618-4dc7-b89e-7c516d699154\" (UID: \"8dc931b7-a618-4dc7-b89e-7c516d699154\") " Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.693252 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvldg\" (UniqueName: \"kubernetes.io/projected/aaf6dc38-fdba-4044-8f8b-b47d4d03db0a-kube-api-access-dvldg\") pod \"aaf6dc38-fdba-4044-8f8b-b47d4d03db0a\" (UID: \"aaf6dc38-fdba-4044-8f8b-b47d4d03db0a\") " Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.693287 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/8dc931b7-a618-4dc7-b89e-7c516d699154-config-data\") pod \"8dc931b7-a618-4dc7-b89e-7c516d699154\" (UID: \"8dc931b7-a618-4dc7-b89e-7c516d699154\") " Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.693311 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaf6dc38-fdba-4044-8f8b-b47d4d03db0a-combined-ca-bundle\") pod \"aaf6dc38-fdba-4044-8f8b-b47d4d03db0a\" (UID: \"aaf6dc38-fdba-4044-8f8b-b47d4d03db0a\") " Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.694024 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaf6dc38-fdba-4044-8f8b-b47d4d03db0a-scripts\") pod \"aaf6dc38-fdba-4044-8f8b-b47d4d03db0a\" (UID: \"aaf6dc38-fdba-4044-8f8b-b47d4d03db0a\") " Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.695415 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.695437 4869 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.695448 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.697870 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dc931b7-a618-4dc7-b89e-7c516d699154-scripts" (OuterVolumeSpecName: "scripts") pod "8dc931b7-a618-4dc7-b89e-7c516d699154" (UID: 
"8dc931b7-a618-4dc7-b89e-7c516d699154"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.699277 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dc931b7-a618-4dc7-b89e-7c516d699154-kube-api-access-5kdnl" (OuterVolumeSpecName: "kube-api-access-5kdnl") pod "8dc931b7-a618-4dc7-b89e-7c516d699154" (UID: "8dc931b7-a618-4dc7-b89e-7c516d699154"). InnerVolumeSpecName "kube-api-access-5kdnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.699426 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaf6dc38-fdba-4044-8f8b-b47d4d03db0a-kube-api-access-dvldg" (OuterVolumeSpecName: "kube-api-access-dvldg") pod "aaf6dc38-fdba-4044-8f8b-b47d4d03db0a" (UID: "aaf6dc38-fdba-4044-8f8b-b47d4d03db0a"). InnerVolumeSpecName "kube-api-access-dvldg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.699943 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaf6dc38-fdba-4044-8f8b-b47d4d03db0a-scripts" (OuterVolumeSpecName: "scripts") pod "aaf6dc38-fdba-4044-8f8b-b47d4d03db0a" (UID: "aaf6dc38-fdba-4044-8f8b-b47d4d03db0a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.720135 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dc931b7-a618-4dc7-b89e-7c516d699154-config-data" (OuterVolumeSpecName: "config-data") pod "8dc931b7-a618-4dc7-b89e-7c516d699154" (UID: "8dc931b7-a618-4dc7-b89e-7c516d699154"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.720403 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaf6dc38-fdba-4044-8f8b-b47d4d03db0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aaf6dc38-fdba-4044-8f8b-b47d4d03db0a" (UID: "aaf6dc38-fdba-4044-8f8b-b47d4d03db0a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.724034 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaf6dc38-fdba-4044-8f8b-b47d4d03db0a-config-data" (OuterVolumeSpecName: "config-data") pod "aaf6dc38-fdba-4044-8f8b-b47d4d03db0a" (UID: "aaf6dc38-fdba-4044-8f8b-b47d4d03db0a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.728074 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dc931b7-a618-4dc7-b89e-7c516d699154-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8dc931b7-a618-4dc7-b89e-7c516d699154" (UID: "8dc931b7-a618-4dc7-b89e-7c516d699154"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.797945 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaf6dc38-fdba-4044-8f8b-b47d4d03db0a-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.798001 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kdnl\" (UniqueName: \"kubernetes.io/projected/8dc931b7-a618-4dc7-b89e-7c516d699154-kube-api-access-5kdnl\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.798018 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc931b7-a618-4dc7-b89e-7c516d699154-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.798030 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaf6dc38-fdba-4044-8f8b-b47d4d03db0a-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.798045 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dc931b7-a618-4dc7-b89e-7c516d699154-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.798057 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvldg\" (UniqueName: \"kubernetes.io/projected/aaf6dc38-fdba-4044-8f8b-b47d4d03db0a-kube-api-access-dvldg\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.798069 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaf6dc38-fdba-4044-8f8b-b47d4d03db0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:42 crc kubenswrapper[4869]: I0218 06:07:42.798081 4869 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dc931b7-a618-4dc7-b89e-7c516d699154-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.103249 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" event={"ID":"e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4","Type":"ContainerDied","Data":"6d00df454313bd1d26df965c9a2d38a92e94d749cf51013ec9eb3877ab5a8501"} Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.103310 4869 scope.go:117] "RemoveContainer" containerID="86821b867917f19b5e7cf53b0d8f4e915c7e763184a8c5a5374ada8f15d21bb2" Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.103425 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-qd9jb" Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.106763 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-74k4h" event={"ID":"8dc931b7-a618-4dc7-b89e-7c516d699154","Type":"ContainerDied","Data":"2c56e9372cc18781425bb7b7c095d160be4dbbd0b544b3dab3beb8c47e891c39"} Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.106803 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c56e9372cc18781425bb7b7c095d160be4dbbd0b544b3dab3beb8c47e891c39" Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.106868 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-74k4h" Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.122025 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wbsdh" Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.123377 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wbsdh" event={"ID":"aaf6dc38-fdba-4044-8f8b-b47d4d03db0a","Type":"ContainerDied","Data":"8d671906394af9554af6e1eaa7d78c7eab2f4eee3a4003d777462411ddc912d2"} Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.123422 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d671906394af9554af6e1eaa7d78c7eab2f4eee3a4003d777462411ddc912d2" Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.151397 4869 scope.go:117] "RemoveContainer" containerID="e33cc65acebee6f88041033d26c8405796e8012234ffbd3c6399768d8de24797" Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.207341 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-qd9jb"] Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.235935 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-qd9jb"] Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.245466 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 18 06:07:43 crc kubenswrapper[4869]: E0218 06:07:43.245966 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4" containerName="dnsmasq-dns" Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.245984 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4" containerName="dnsmasq-dns" Feb 18 06:07:43 crc kubenswrapper[4869]: E0218 06:07:43.245995 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dc931b7-a618-4dc7-b89e-7c516d699154" containerName="nova-manage" Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.246002 4869 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8dc931b7-a618-4dc7-b89e-7c516d699154" containerName="nova-manage" Feb 18 06:07:43 crc kubenswrapper[4869]: E0218 06:07:43.246017 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4" containerName="init" Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.246024 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4" containerName="init" Feb 18 06:07:43 crc kubenswrapper[4869]: E0218 06:07:43.246035 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaf6dc38-fdba-4044-8f8b-b47d4d03db0a" containerName="nova-cell1-conductor-db-sync" Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.246041 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaf6dc38-fdba-4044-8f8b-b47d4d03db0a" containerName="nova-cell1-conductor-db-sync" Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.246205 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dc931b7-a618-4dc7-b89e-7c516d699154" containerName="nova-manage" Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.246221 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4" containerName="dnsmasq-dns" Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.246231 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaf6dc38-fdba-4044-8f8b-b47d4d03db0a" containerName="nova-cell1-conductor-db-sync" Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.246880 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.252671 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.253672 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.286561 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.286904 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1af28061-c2b3-4800-8435-a6c2ca8c7cf0" containerName="nova-api-log" containerID="cri-o://5a7927188eea91807f91983863255b3e1ce74e9d7aa4ffbaee83f130ea3e46d5" gracePeriod=30
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.287496 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1af28061-c2b3-4800-8435-a6c2ca8c7cf0" containerName="nova-api-api" containerID="cri-o://00aafa79c39d94c976f856d0830f96dc21f72d12704d823f3e40b70f8f5fc15f" gracePeriod=30
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.316713 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.338575 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.341323 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="822b5889-38d9-4e39-b043-5c52152c6917" containerName="nova-metadata-log" containerID="cri-o://683daeac288a6bf39e7831e4dee5d762444b7de8df3c2699865162e734076e97" gracePeriod=30
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.341391 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="822b5889-38d9-4e39-b043-5c52152c6917" containerName="nova-metadata-metadata" containerID="cri-o://7c4a11e77907552485dea7eb1a95b90312f34a1f37d0dbd2f7388429ca3d5c83" gracePeriod=30
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.410568 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4662bd21-e3c5-4980-bac5-dc1f76c958c3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4662bd21-e3c5-4980-bac5-dc1f76c958c3\") " pod="openstack/nova-cell1-conductor-0"
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.410701 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4662bd21-e3c5-4980-bac5-dc1f76c958c3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4662bd21-e3c5-4980-bac5-dc1f76c958c3\") " pod="openstack/nova-cell1-conductor-0"
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.410785 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrhfp\" (UniqueName: \"kubernetes.io/projected/4662bd21-e3c5-4980-bac5-dc1f76c958c3-kube-api-access-zrhfp\") pod \"nova-cell1-conductor-0\" (UID: \"4662bd21-e3c5-4980-bac5-dc1f76c958c3\") " pod="openstack/nova-cell1-conductor-0"
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.437907 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.437961 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.486473 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4" path="/var/lib/kubelet/pods/e5ed2006-0bdb-4fa9-a336-a6b5b7357ac4/volumes"
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.512240 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4662bd21-e3c5-4980-bac5-dc1f76c958c3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4662bd21-e3c5-4980-bac5-dc1f76c958c3\") " pod="openstack/nova-cell1-conductor-0"
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.512338 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrhfp\" (UniqueName: \"kubernetes.io/projected/4662bd21-e3c5-4980-bac5-dc1f76c958c3-kube-api-access-zrhfp\") pod \"nova-cell1-conductor-0\" (UID: \"4662bd21-e3c5-4980-bac5-dc1f76c958c3\") " pod="openstack/nova-cell1-conductor-0"
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.512386 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4662bd21-e3c5-4980-bac5-dc1f76c958c3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4662bd21-e3c5-4980-bac5-dc1f76c958c3\") " pod="openstack/nova-cell1-conductor-0"
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.518308 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4662bd21-e3c5-4980-bac5-dc1f76c958c3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4662bd21-e3c5-4980-bac5-dc1f76c958c3\") " pod="openstack/nova-cell1-conductor-0"
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.521331 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4662bd21-e3c5-4980-bac5-dc1f76c958c3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4662bd21-e3c5-4980-bac5-dc1f76c958c3\") " pod="openstack/nova-cell1-conductor-0"
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.529146 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrhfp\" (UniqueName: \"kubernetes.io/projected/4662bd21-e3c5-4980-bac5-dc1f76c958c3-kube-api-access-zrhfp\") pod \"nova-cell1-conductor-0\" (UID: \"4662bd21-e3c5-4980-bac5-dc1f76c958c3\") " pod="openstack/nova-cell1-conductor-0"
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.570451 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.747318 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.932544 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxv4t\" (UniqueName: \"kubernetes.io/projected/822b5889-38d9-4e39-b043-5c52152c6917-kube-api-access-wxv4t\") pod \"822b5889-38d9-4e39-b043-5c52152c6917\" (UID: \"822b5889-38d9-4e39-b043-5c52152c6917\") "
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.932612 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/822b5889-38d9-4e39-b043-5c52152c6917-combined-ca-bundle\") pod \"822b5889-38d9-4e39-b043-5c52152c6917\" (UID: \"822b5889-38d9-4e39-b043-5c52152c6917\") "
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.932666 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/822b5889-38d9-4e39-b043-5c52152c6917-config-data\") pod \"822b5889-38d9-4e39-b043-5c52152c6917\" (UID: \"822b5889-38d9-4e39-b043-5c52152c6917\") "
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.932719 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/822b5889-38d9-4e39-b043-5c52152c6917-logs\") pod \"822b5889-38d9-4e39-b043-5c52152c6917\" (UID: \"822b5889-38d9-4e39-b043-5c52152c6917\") "
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.932846 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/822b5889-38d9-4e39-b043-5c52152c6917-nova-metadata-tls-certs\") pod \"822b5889-38d9-4e39-b043-5c52152c6917\" (UID: \"822b5889-38d9-4e39-b043-5c52152c6917\") "
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.937049 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/822b5889-38d9-4e39-b043-5c52152c6917-logs" (OuterVolumeSpecName: "logs") pod "822b5889-38d9-4e39-b043-5c52152c6917" (UID: "822b5889-38d9-4e39-b043-5c52152c6917"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.943029 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/822b5889-38d9-4e39-b043-5c52152c6917-kube-api-access-wxv4t" (OuterVolumeSpecName: "kube-api-access-wxv4t") pod "822b5889-38d9-4e39-b043-5c52152c6917" (UID: "822b5889-38d9-4e39-b043-5c52152c6917"). InnerVolumeSpecName "kube-api-access-wxv4t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.957894 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/822b5889-38d9-4e39-b043-5c52152c6917-config-data" (OuterVolumeSpecName: "config-data") pod "822b5889-38d9-4e39-b043-5c52152c6917" (UID: "822b5889-38d9-4e39-b043-5c52152c6917"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.960235 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/822b5889-38d9-4e39-b043-5c52152c6917-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "822b5889-38d9-4e39-b043-5c52152c6917" (UID: "822b5889-38d9-4e39-b043-5c52152c6917"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:07:43 crc kubenswrapper[4869]: I0218 06:07:43.995351 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/822b5889-38d9-4e39-b043-5c52152c6917-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "822b5889-38d9-4e39-b043-5c52152c6917" (UID: "822b5889-38d9-4e39-b043-5c52152c6917"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.034937 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/822b5889-38d9-4e39-b043-5c52152c6917-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.034973 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/822b5889-38d9-4e39-b043-5c52152c6917-logs\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.034983 4869 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/822b5889-38d9-4e39-b043-5c52152c6917-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.034992 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxv4t\" (UniqueName: \"kubernetes.io/projected/822b5889-38d9-4e39-b043-5c52152c6917-kube-api-access-wxv4t\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.035006 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/822b5889-38d9-4e39-b043-5c52152c6917-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.055589 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 18 06:07:44 crc kubenswrapper[4869]: W0218 06:07:44.069420 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4662bd21_e3c5_4980_bac5_dc1f76c958c3.slice/crio-dc23f1c39823f54112d93a166dccd1fef1cf144bcd49fef4d4814b91a67df911 WatchSource:0}: Error finding container dc23f1c39823f54112d93a166dccd1fef1cf144bcd49fef4d4814b91a67df911: Status 404 returned error can't find the container with id dc23f1c39823f54112d93a166dccd1fef1cf144bcd49fef4d4814b91a67df911
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.142580 4869 generic.go:334] "Generic (PLEG): container finished" podID="822b5889-38d9-4e39-b043-5c52152c6917" containerID="7c4a11e77907552485dea7eb1a95b90312f34a1f37d0dbd2f7388429ca3d5c83" exitCode=0
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.142623 4869 generic.go:334] "Generic (PLEG): container finished" podID="822b5889-38d9-4e39-b043-5c52152c6917" containerID="683daeac288a6bf39e7831e4dee5d762444b7de8df3c2699865162e734076e97" exitCode=143
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.142682 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"822b5889-38d9-4e39-b043-5c52152c6917","Type":"ContainerDied","Data":"7c4a11e77907552485dea7eb1a95b90312f34a1f37d0dbd2f7388429ca3d5c83"}
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.142719 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"822b5889-38d9-4e39-b043-5c52152c6917","Type":"ContainerDied","Data":"683daeac288a6bf39e7831e4dee5d762444b7de8df3c2699865162e734076e97"}
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.142735 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"822b5889-38d9-4e39-b043-5c52152c6917","Type":"ContainerDied","Data":"ebaadb5e53a6865b1c1dffcb72533cdf4399f92fed54c8cbb48ae624c2961878"}
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.142776 4869 scope.go:117] "RemoveContainer" containerID="7c4a11e77907552485dea7eb1a95b90312f34a1f37d0dbd2f7388429ca3d5c83"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.142923 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.151225 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4662bd21-e3c5-4980-bac5-dc1f76c958c3","Type":"ContainerStarted","Data":"dc23f1c39823f54112d93a166dccd1fef1cf144bcd49fef4d4814b91a67df911"}
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.154975 4869 generic.go:334] "Generic (PLEG): container finished" podID="1af28061-c2b3-4800-8435-a6c2ca8c7cf0" containerID="5a7927188eea91807f91983863255b3e1ce74e9d7aa4ffbaee83f130ea3e46d5" exitCode=143
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.155145 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="338060e9-0b03-460e-a0d8-28f78a5de526" containerName="nova-scheduler-scheduler" containerID="cri-o://ca19f3f6229c0237377a4b72f5acf8aeea46af499069486ab62e7421276adb37" gracePeriod=30
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.155375 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1af28061-c2b3-4800-8435-a6c2ca8c7cf0","Type":"ContainerDied","Data":"5a7927188eea91807f91983863255b3e1ce74e9d7aa4ffbaee83f130ea3e46d5"}
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.183599 4869 scope.go:117] "RemoveContainer" containerID="683daeac288a6bf39e7831e4dee5d762444b7de8df3c2699865162e734076e97"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.213476 4869 scope.go:117] "RemoveContainer" containerID="7c4a11e77907552485dea7eb1a95b90312f34a1f37d0dbd2f7388429ca3d5c83"
Feb 18 06:07:44 crc kubenswrapper[4869]: E0218 06:07:44.214070 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c4a11e77907552485dea7eb1a95b90312f34a1f37d0dbd2f7388429ca3d5c83\": container with ID starting with 7c4a11e77907552485dea7eb1a95b90312f34a1f37d0dbd2f7388429ca3d5c83 not found: ID does not exist" containerID="7c4a11e77907552485dea7eb1a95b90312f34a1f37d0dbd2f7388429ca3d5c83"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.214110 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c4a11e77907552485dea7eb1a95b90312f34a1f37d0dbd2f7388429ca3d5c83"} err="failed to get container status \"7c4a11e77907552485dea7eb1a95b90312f34a1f37d0dbd2f7388429ca3d5c83\": rpc error: code = NotFound desc = could not find container \"7c4a11e77907552485dea7eb1a95b90312f34a1f37d0dbd2f7388429ca3d5c83\": container with ID starting with 7c4a11e77907552485dea7eb1a95b90312f34a1f37d0dbd2f7388429ca3d5c83 not found: ID does not exist"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.214135 4869 scope.go:117] "RemoveContainer" containerID="683daeac288a6bf39e7831e4dee5d762444b7de8df3c2699865162e734076e97"
Feb 18 06:07:44 crc kubenswrapper[4869]: E0218 06:07:44.214513 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"683daeac288a6bf39e7831e4dee5d762444b7de8df3c2699865162e734076e97\": container with ID starting with 683daeac288a6bf39e7831e4dee5d762444b7de8df3c2699865162e734076e97 not found: ID does not exist" containerID="683daeac288a6bf39e7831e4dee5d762444b7de8df3c2699865162e734076e97"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.214546 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"683daeac288a6bf39e7831e4dee5d762444b7de8df3c2699865162e734076e97"} err="failed to get container status \"683daeac288a6bf39e7831e4dee5d762444b7de8df3c2699865162e734076e97\": rpc error: code = NotFound desc = could not find container \"683daeac288a6bf39e7831e4dee5d762444b7de8df3c2699865162e734076e97\": container with ID starting with 683daeac288a6bf39e7831e4dee5d762444b7de8df3c2699865162e734076e97 not found: ID does not exist"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.214562 4869 scope.go:117] "RemoveContainer" containerID="7c4a11e77907552485dea7eb1a95b90312f34a1f37d0dbd2f7388429ca3d5c83"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.215062 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c4a11e77907552485dea7eb1a95b90312f34a1f37d0dbd2f7388429ca3d5c83"} err="failed to get container status \"7c4a11e77907552485dea7eb1a95b90312f34a1f37d0dbd2f7388429ca3d5c83\": rpc error: code = NotFound desc = could not find container \"7c4a11e77907552485dea7eb1a95b90312f34a1f37d0dbd2f7388429ca3d5c83\": container with ID starting with 7c4a11e77907552485dea7eb1a95b90312f34a1f37d0dbd2f7388429ca3d5c83 not found: ID does not exist"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.215133 4869 scope.go:117] "RemoveContainer" containerID="683daeac288a6bf39e7831e4dee5d762444b7de8df3c2699865162e734076e97"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.215523 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"683daeac288a6bf39e7831e4dee5d762444b7de8df3c2699865162e734076e97"} err="failed to get container status \"683daeac288a6bf39e7831e4dee5d762444b7de8df3c2699865162e734076e97\": rpc error: code = NotFound desc = could not find container \"683daeac288a6bf39e7831e4dee5d762444b7de8df3c2699865162e734076e97\": container with ID starting with 683daeac288a6bf39e7831e4dee5d762444b7de8df3c2699865162e734076e97 not found: ID does not exist"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.216360 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.224017 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.238687 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 06:07:44 crc kubenswrapper[4869]: E0218 06:07:44.239077 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822b5889-38d9-4e39-b043-5c52152c6917" containerName="nova-metadata-log"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.239094 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="822b5889-38d9-4e39-b043-5c52152c6917" containerName="nova-metadata-log"
Feb 18 06:07:44 crc kubenswrapper[4869]: E0218 06:07:44.239116 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822b5889-38d9-4e39-b043-5c52152c6917" containerName="nova-metadata-metadata"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.239122 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="822b5889-38d9-4e39-b043-5c52152c6917" containerName="nova-metadata-metadata"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.239322 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="822b5889-38d9-4e39-b043-5c52152c6917" containerName="nova-metadata-metadata"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.239346 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="822b5889-38d9-4e39-b043-5c52152c6917" containerName="nova-metadata-log"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.240255 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.243727 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.244022 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.263999 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.339921 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-logs\") pod \"nova-metadata-0\" (UID: \"79bdd2ba-f727-450b-a1ee-08dd5c68e84f\") " pod="openstack/nova-metadata-0"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.339987 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-config-data\") pod \"nova-metadata-0\" (UID: \"79bdd2ba-f727-450b-a1ee-08dd5c68e84f\") " pod="openstack/nova-metadata-0"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.340063 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"79bdd2ba-f727-450b-a1ee-08dd5c68e84f\") " pod="openstack/nova-metadata-0"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.340143 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxhd6\" (UniqueName: \"kubernetes.io/projected/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-kube-api-access-hxhd6\") pod \"nova-metadata-0\" (UID: \"79bdd2ba-f727-450b-a1ee-08dd5c68e84f\") " pod="openstack/nova-metadata-0"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.340180 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"79bdd2ba-f727-450b-a1ee-08dd5c68e84f\") " pod="openstack/nova-metadata-0"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.441822 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxhd6\" (UniqueName: \"kubernetes.io/projected/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-kube-api-access-hxhd6\") pod \"nova-metadata-0\" (UID: \"79bdd2ba-f727-450b-a1ee-08dd5c68e84f\") " pod="openstack/nova-metadata-0"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.441881 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"79bdd2ba-f727-450b-a1ee-08dd5c68e84f\") " pod="openstack/nova-metadata-0"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.441908 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-logs\") pod \"nova-metadata-0\" (UID: \"79bdd2ba-f727-450b-a1ee-08dd5c68e84f\") " pod="openstack/nova-metadata-0"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.441944 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-config-data\") pod \"nova-metadata-0\" (UID: \"79bdd2ba-f727-450b-a1ee-08dd5c68e84f\") " pod="openstack/nova-metadata-0"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.441989 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"79bdd2ba-f727-450b-a1ee-08dd5c68e84f\") " pod="openstack/nova-metadata-0"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.442554 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-logs\") pod \"nova-metadata-0\" (UID: \"79bdd2ba-f727-450b-a1ee-08dd5c68e84f\") " pod="openstack/nova-metadata-0"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.446196 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"79bdd2ba-f727-450b-a1ee-08dd5c68e84f\") " pod="openstack/nova-metadata-0"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.446788 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"79bdd2ba-f727-450b-a1ee-08dd5c68e84f\") " pod="openstack/nova-metadata-0"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.448055 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-config-data\") pod \"nova-metadata-0\" (UID: \"79bdd2ba-f727-450b-a1ee-08dd5c68e84f\") " pod="openstack/nova-metadata-0"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.462976 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxhd6\" (UniqueName: \"kubernetes.io/projected/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-kube-api-access-hxhd6\") pod \"nova-metadata-0\" (UID: \"79bdd2ba-f727-450b-a1ee-08dd5c68e84f\") " pod="openstack/nova-metadata-0"
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.556288 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 18 06:07:44 crc kubenswrapper[4869]: W0218 06:07:44.988231 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79bdd2ba_f727_450b_a1ee_08dd5c68e84f.slice/crio-a525d04f1dea3b855d455276e4ffaa21fc705e0e900dd1bf5aa95ff2a5b2ab37 WatchSource:0}: Error finding container a525d04f1dea3b855d455276e4ffaa21fc705e0e900dd1bf5aa95ff2a5b2ab37: Status 404 returned error can't find the container with id a525d04f1dea3b855d455276e4ffaa21fc705e0e900dd1bf5aa95ff2a5b2ab37
Feb 18 06:07:44 crc kubenswrapper[4869]: I0218 06:07:44.988299 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 06:07:45 crc kubenswrapper[4869]: I0218 06:07:45.164047 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79bdd2ba-f727-450b-a1ee-08dd5c68e84f","Type":"ContainerStarted","Data":"a525d04f1dea3b855d455276e4ffaa21fc705e0e900dd1bf5aa95ff2a5b2ab37"}
Feb 18 06:07:45 crc kubenswrapper[4869]: I0218 06:07:45.169196 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4662bd21-e3c5-4980-bac5-dc1f76c958c3","Type":"ContainerStarted","Data":"b689ce5a63c599fdd9b983893e7bbe62be87901e2f60b86279ae71aee1eed29b"}
Feb 18 06:07:45 crc kubenswrapper[4869]: I0218 06:07:45.169361 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Feb 18 06:07:45 crc kubenswrapper[4869]: I0218 06:07:45.489232 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="822b5889-38d9-4e39-b043-5c52152c6917" path="/var/lib/kubelet/pods/822b5889-38d9-4e39-b043-5c52152c6917/volumes"
Feb 18 06:07:46 crc kubenswrapper[4869]: I0218 06:07:46.180972 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79bdd2ba-f727-450b-a1ee-08dd5c68e84f","Type":"ContainerStarted","Data":"065a24d0c8e4d94507fe2025b0fd4599260ee794c44483e2eefc1f88def23f68"}
Feb 18 06:07:46 crc kubenswrapper[4869]: I0218 06:07:46.181015 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79bdd2ba-f727-450b-a1ee-08dd5c68e84f","Type":"ContainerStarted","Data":"73c6e2ec4928699d25deb2f1c78fa2c65bf8b497fbeff5f8e21135d4e8f1c284"}
Feb 18 06:07:46 crc kubenswrapper[4869]: I0218 06:07:46.204029 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.204009183 podStartE2EDuration="2.204009183s" podCreationTimestamp="2026-02-18 06:07:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:07:46.198485169 +0000 UTC m=+1163.367573441" watchObservedRunningTime="2026-02-18 06:07:46.204009183 +0000 UTC m=+1163.373097435"
Feb 18 06:07:46 crc kubenswrapper[4869]: I0218 06:07:46.205209 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.205200733 podStartE2EDuration="3.205200733s" podCreationTimestamp="2026-02-18 06:07:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:07:45.189913504 +0000 UTC m=+1162.359001736" watchObservedRunningTime="2026-02-18 06:07:46.205200733 +0000 UTC m=+1163.374288975"
Feb 18 06:07:46 crc kubenswrapper[4869]: E0218 06:07:46.604393 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ca19f3f6229c0237377a4b72f5acf8aeea46af499069486ab62e7421276adb37" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 18 06:07:46 crc kubenswrapper[4869]: E0218 06:07:46.606143 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ca19f3f6229c0237377a4b72f5acf8aeea46af499069486ab62e7421276adb37" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 18 06:07:46 crc kubenswrapper[4869]: E0218 06:07:46.607445 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ca19f3f6229c0237377a4b72f5acf8aeea46af499069486ab62e7421276adb37" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 18 06:07:46 crc kubenswrapper[4869]: E0218 06:07:46.607481 4869 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="338060e9-0b03-460e-a0d8-28f78a5de526" containerName="nova-scheduler-scheduler"
Feb 18 06:07:48 crc kubenswrapper[4869]: I0218 06:07:48.199545 4869 generic.go:334] "Generic (PLEG): container finished" podID="1af28061-c2b3-4800-8435-a6c2ca8c7cf0" containerID="00aafa79c39d94c976f856d0830f96dc21f72d12704d823f3e40b70f8f5fc15f" exitCode=0
Feb 18 06:07:48 crc kubenswrapper[4869]: I0218 06:07:48.199620 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1af28061-c2b3-4800-8435-a6c2ca8c7cf0","Type":"ContainerDied","Data":"00aafa79c39d94c976f856d0830f96dc21f72d12704d823f3e40b70f8f5fc15f"}
Feb 18 06:07:48 crc kubenswrapper[4869]: I0218 06:07:48.201860 4869 generic.go:334] "Generic (PLEG): container finished" podID="338060e9-0b03-460e-a0d8-28f78a5de526" containerID="ca19f3f6229c0237377a4b72f5acf8aeea46af499069486ab62e7421276adb37" exitCode=0
Feb 18 06:07:48 crc kubenswrapper[4869]: I0218 06:07:48.201889 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"338060e9-0b03-460e-a0d8-28f78a5de526","Type":"ContainerDied","Data":"ca19f3f6229c0237377a4b72f5acf8aeea46af499069486ab62e7421276adb37"}
Feb 18 06:07:48 crc kubenswrapper[4869]: I0218 06:07:48.466205 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 18 06:07:48 crc kubenswrapper[4869]: I0218 06:07:48.527251 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/338060e9-0b03-460e-a0d8-28f78a5de526-config-data\") pod \"338060e9-0b03-460e-a0d8-28f78a5de526\" (UID: \"338060e9-0b03-460e-a0d8-28f78a5de526\") "
Feb 18 06:07:48 crc kubenswrapper[4869]: I0218 06:07:48.527374 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlxz5\" (UniqueName: \"kubernetes.io/projected/338060e9-0b03-460e-a0d8-28f78a5de526-kube-api-access-hlxz5\") pod \"338060e9-0b03-460e-a0d8-28f78a5de526\" (UID: \"338060e9-0b03-460e-a0d8-28f78a5de526\") "
Feb 18 06:07:48 crc kubenswrapper[4869]: I0218 06:07:48.527403 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/338060e9-0b03-460e-a0d8-28f78a5de526-combined-ca-bundle\") pod \"338060e9-0b03-460e-a0d8-28f78a5de526\" (UID: \"338060e9-0b03-460e-a0d8-28f78a5de526\") "
Feb 18 06:07:48 crc kubenswrapper[4869]: I0218 06:07:48.542100 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/338060e9-0b03-460e-a0d8-28f78a5de526-kube-api-access-hlxz5" (OuterVolumeSpecName: "kube-api-access-hlxz5") pod "338060e9-0b03-460e-a0d8-28f78a5de526" (UID: "338060e9-0b03-460e-a0d8-28f78a5de526"). InnerVolumeSpecName "kube-api-access-hlxz5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:07:48 crc kubenswrapper[4869]: I0218 06:07:48.584505 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/338060e9-0b03-460e-a0d8-28f78a5de526-config-data" (OuterVolumeSpecName: "config-data") pod "338060e9-0b03-460e-a0d8-28f78a5de526" (UID: "338060e9-0b03-460e-a0d8-28f78a5de526"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:07:48 crc kubenswrapper[4869]: I0218 06:07:48.597731 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/338060e9-0b03-460e-a0d8-28f78a5de526-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "338060e9-0b03-460e-a0d8-28f78a5de526" (UID: "338060e9-0b03-460e-a0d8-28f78a5de526"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:07:48 crc kubenswrapper[4869]: I0218 06:07:48.629538 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/338060e9-0b03-460e-a0d8-28f78a5de526-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:48 crc kubenswrapper[4869]: I0218 06:07:48.629640 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlxz5\" (UniqueName: \"kubernetes.io/projected/338060e9-0b03-460e-a0d8-28f78a5de526-kube-api-access-hlxz5\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:48 crc kubenswrapper[4869]: I0218 06:07:48.629658 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/338060e9-0b03-460e-a0d8-28f78a5de526-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:48 crc kubenswrapper[4869]: I0218 06:07:48.647800 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 18 06:07:48 crc kubenswrapper[4869]: I0218 06:07:48.730011 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtzxz\" (UniqueName: \"kubernetes.io/projected/1af28061-c2b3-4800-8435-a6c2ca8c7cf0-kube-api-access-rtzxz\") pod \"1af28061-c2b3-4800-8435-a6c2ca8c7cf0\" (UID: \"1af28061-c2b3-4800-8435-a6c2ca8c7cf0\") "
Feb 18 06:07:48 crc kubenswrapper[4869]: I0218 06:07:48.730184 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1af28061-c2b3-4800-8435-a6c2ca8c7cf0-config-data\") pod \"1af28061-c2b3-4800-8435-a6c2ca8c7cf0\" (UID: \"1af28061-c2b3-4800-8435-a6c2ca8c7cf0\") "
Feb 18 06:07:48 crc kubenswrapper[4869]: I0218 06:07:48.730233 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1af28061-c2b3-4800-8435-a6c2ca8c7cf0-combined-ca-bundle\") pod \"1af28061-c2b3-4800-8435-a6c2ca8c7cf0\" (UID: \"1af28061-c2b3-4800-8435-a6c2ca8c7cf0\") "
Feb 18 06:07:48 crc kubenswrapper[4869]: I0218 06:07:48.730272 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1af28061-c2b3-4800-8435-a6c2ca8c7cf0-logs\") pod \"1af28061-c2b3-4800-8435-a6c2ca8c7cf0\" (UID: \"1af28061-c2b3-4800-8435-a6c2ca8c7cf0\") "
Feb 18 06:07:48 crc kubenswrapper[4869]: I0218 06:07:48.731723 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1af28061-c2b3-4800-8435-a6c2ca8c7cf0-logs" (OuterVolumeSpecName: "logs") pod "1af28061-c2b3-4800-8435-a6c2ca8c7cf0" (UID: "1af28061-c2b3-4800-8435-a6c2ca8c7cf0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:07:48 crc kubenswrapper[4869]: I0218 06:07:48.738009 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1af28061-c2b3-4800-8435-a6c2ca8c7cf0-kube-api-access-rtzxz" (OuterVolumeSpecName: "kube-api-access-rtzxz") pod "1af28061-c2b3-4800-8435-a6c2ca8c7cf0" (UID: "1af28061-c2b3-4800-8435-a6c2ca8c7cf0"). InnerVolumeSpecName "kube-api-access-rtzxz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:07:48 crc kubenswrapper[4869]: I0218 06:07:48.762658 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1af28061-c2b3-4800-8435-a6c2ca8c7cf0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1af28061-c2b3-4800-8435-a6c2ca8c7cf0" (UID: "1af28061-c2b3-4800-8435-a6c2ca8c7cf0"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:48 crc kubenswrapper[4869]: I0218 06:07:48.763567 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1af28061-c2b3-4800-8435-a6c2ca8c7cf0-config-data" (OuterVolumeSpecName: "config-data") pod "1af28061-c2b3-4800-8435-a6c2ca8c7cf0" (UID: "1af28061-c2b3-4800-8435-a6c2ca8c7cf0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:48 crc kubenswrapper[4869]: I0218 06:07:48.831848 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1af28061-c2b3-4800-8435-a6c2ca8c7cf0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:48 crc kubenswrapper[4869]: I0218 06:07:48.831885 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1af28061-c2b3-4800-8435-a6c2ca8c7cf0-logs\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:48 crc kubenswrapper[4869]: I0218 06:07:48.831895 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtzxz\" (UniqueName: \"kubernetes.io/projected/1af28061-c2b3-4800-8435-a6c2ca8c7cf0-kube-api-access-rtzxz\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:48 crc kubenswrapper[4869]: I0218 06:07:48.831906 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1af28061-c2b3-4800-8435-a6c2ca8c7cf0-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.247657 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1af28061-c2b3-4800-8435-a6c2ca8c7cf0","Type":"ContainerDied","Data":"26892d4888d6fd94a61d78013e94e11c614ef2b490f0cf052a1088877152fff9"} Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.247804 4869 scope.go:117] "RemoveContainer" 
containerID="00aafa79c39d94c976f856d0830f96dc21f72d12704d823f3e40b70f8f5fc15f" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.248024 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.254008 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"338060e9-0b03-460e-a0d8-28f78a5de526","Type":"ContainerDied","Data":"a0004efd335d4fa444a524b7e474e7a02fb75756e6696de8f425af190fe35d79"} Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.254130 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.284034 4869 scope.go:117] "RemoveContainer" containerID="5a7927188eea91807f91983863255b3e1ce74e9d7aa4ffbaee83f130ea3e46d5" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.295490 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.313809 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.320349 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.330385 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.341532 4869 scope.go:117] "RemoveContainer" containerID="ca19f3f6229c0237377a4b72f5acf8aeea46af499069486ab62e7421276adb37" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.353817 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 06:07:49 crc kubenswrapper[4869]: E0218 06:07:49.354345 4869 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1af28061-c2b3-4800-8435-a6c2ca8c7cf0" containerName="nova-api-log" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.354364 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="1af28061-c2b3-4800-8435-a6c2ca8c7cf0" containerName="nova-api-log" Feb 18 06:07:49 crc kubenswrapper[4869]: E0218 06:07:49.354397 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1af28061-c2b3-4800-8435-a6c2ca8c7cf0" containerName="nova-api-api" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.354406 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="1af28061-c2b3-4800-8435-a6c2ca8c7cf0" containerName="nova-api-api" Feb 18 06:07:49 crc kubenswrapper[4869]: E0218 06:07:49.354418 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="338060e9-0b03-460e-a0d8-28f78a5de526" containerName="nova-scheduler-scheduler" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.354427 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="338060e9-0b03-460e-a0d8-28f78a5de526" containerName="nova-scheduler-scheduler" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.354671 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="1af28061-c2b3-4800-8435-a6c2ca8c7cf0" containerName="nova-api-log" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.354697 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="338060e9-0b03-460e-a0d8-28f78a5de526" containerName="nova-scheduler-scheduler" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.354708 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="1af28061-c2b3-4800-8435-a6c2ca8c7cf0" containerName="nova-api-api" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.355513 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.359802 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.360009 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.361352 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.362603 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.385043 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.403440 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.480020 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1af28061-c2b3-4800-8435-a6c2ca8c7cf0" path="/var/lib/kubelet/pods/1af28061-c2b3-4800-8435-a6c2ca8c7cf0/volumes" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.480890 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="338060e9-0b03-460e-a0d8-28f78a5de526" path="/var/lib/kubelet/pods/338060e9-0b03-460e-a0d8-28f78a5de526/volumes" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.556730 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90248f87-9c36-4636-8c28-366356c9924e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"90248f87-9c36-4636-8c28-366356c9924e\") " pod="openstack/nova-api-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.556842 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/066c0252-91a6-43e7-9164-3576a49ec5f8-config-data\") pod \"nova-scheduler-0\" (UID: \"066c0252-91a6-43e7-9164-3576a49ec5f8\") " pod="openstack/nova-scheduler-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.556865 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/066c0252-91a6-43e7-9164-3576a49ec5f8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"066c0252-91a6-43e7-9164-3576a49ec5f8\") " pod="openstack/nova-scheduler-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.556887 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90248f87-9c36-4636-8c28-366356c9924e-logs\") pod \"nova-api-0\" (UID: \"90248f87-9c36-4636-8c28-366356c9924e\") " pod="openstack/nova-api-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.556915 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90248f87-9c36-4636-8c28-366356c9924e-config-data\") pod \"nova-api-0\" (UID: \"90248f87-9c36-4636-8c28-366356c9924e\") " pod="openstack/nova-api-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.556939 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhn64\" (UniqueName: \"kubernetes.io/projected/90248f87-9c36-4636-8c28-366356c9924e-kube-api-access-zhn64\") pod \"nova-api-0\" (UID: \"90248f87-9c36-4636-8c28-366356c9924e\") " pod="openstack/nova-api-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.556965 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xhrd\" (UniqueName: 
\"kubernetes.io/projected/066c0252-91a6-43e7-9164-3576a49ec5f8-kube-api-access-7xhrd\") pod \"nova-scheduler-0\" (UID: \"066c0252-91a6-43e7-9164-3576a49ec5f8\") " pod="openstack/nova-scheduler-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.557062 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.557100 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.658940 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90248f87-9c36-4636-8c28-366356c9924e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"90248f87-9c36-4636-8c28-366356c9924e\") " pod="openstack/nova-api-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.659094 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/066c0252-91a6-43e7-9164-3576a49ec5f8-config-data\") pod \"nova-scheduler-0\" (UID: \"066c0252-91a6-43e7-9164-3576a49ec5f8\") " pod="openstack/nova-scheduler-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.659136 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/066c0252-91a6-43e7-9164-3576a49ec5f8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"066c0252-91a6-43e7-9164-3576a49ec5f8\") " pod="openstack/nova-scheduler-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.659183 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90248f87-9c36-4636-8c28-366356c9924e-logs\") pod \"nova-api-0\" (UID: \"90248f87-9c36-4636-8c28-366356c9924e\") " pod="openstack/nova-api-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 
06:07:49.659244 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90248f87-9c36-4636-8c28-366356c9924e-config-data\") pod \"nova-api-0\" (UID: \"90248f87-9c36-4636-8c28-366356c9924e\") " pod="openstack/nova-api-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.659297 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhn64\" (UniqueName: \"kubernetes.io/projected/90248f87-9c36-4636-8c28-366356c9924e-kube-api-access-zhn64\") pod \"nova-api-0\" (UID: \"90248f87-9c36-4636-8c28-366356c9924e\") " pod="openstack/nova-api-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.659342 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xhrd\" (UniqueName: \"kubernetes.io/projected/066c0252-91a6-43e7-9164-3576a49ec5f8-kube-api-access-7xhrd\") pod \"nova-scheduler-0\" (UID: \"066c0252-91a6-43e7-9164-3576a49ec5f8\") " pod="openstack/nova-scheduler-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.660043 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90248f87-9c36-4636-8c28-366356c9924e-logs\") pod \"nova-api-0\" (UID: \"90248f87-9c36-4636-8c28-366356c9924e\") " pod="openstack/nova-api-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.664813 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/066c0252-91a6-43e7-9164-3576a49ec5f8-config-data\") pod \"nova-scheduler-0\" (UID: \"066c0252-91a6-43e7-9164-3576a49ec5f8\") " pod="openstack/nova-scheduler-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.665196 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/066c0252-91a6-43e7-9164-3576a49ec5f8-combined-ca-bundle\") pod \"nova-scheduler-0\" 
(UID: \"066c0252-91a6-43e7-9164-3576a49ec5f8\") " pod="openstack/nova-scheduler-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.667398 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90248f87-9c36-4636-8c28-366356c9924e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"90248f87-9c36-4636-8c28-366356c9924e\") " pod="openstack/nova-api-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.668533 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90248f87-9c36-4636-8c28-366356c9924e-config-data\") pod \"nova-api-0\" (UID: \"90248f87-9c36-4636-8c28-366356c9924e\") " pod="openstack/nova-api-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.680245 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhn64\" (UniqueName: \"kubernetes.io/projected/90248f87-9c36-4636-8c28-366356c9924e-kube-api-access-zhn64\") pod \"nova-api-0\" (UID: \"90248f87-9c36-4636-8c28-366356c9924e\") " pod="openstack/nova-api-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.685112 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xhrd\" (UniqueName: \"kubernetes.io/projected/066c0252-91a6-43e7-9164-3576a49ec5f8-kube-api-access-7xhrd\") pod \"nova-scheduler-0\" (UID: \"066c0252-91a6-43e7-9164-3576a49ec5f8\") " pod="openstack/nova-scheduler-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.685848 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 18 06:07:49 crc kubenswrapper[4869]: I0218 06:07:49.971930 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 18 06:07:50 crc kubenswrapper[4869]: I0218 06:07:50.151204 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 18 06:07:50 crc kubenswrapper[4869]: I0218 06:07:50.264353 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"90248f87-9c36-4636-8c28-366356c9924e","Type":"ContainerStarted","Data":"3a6eab19f05944941e3f3fef36d9001fb15375ca5b7328d4a702947d5cbe525d"} Feb 18 06:07:50 crc kubenswrapper[4869]: I0218 06:07:50.421453 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 06:07:50 crc kubenswrapper[4869]: W0218 06:07:50.446908 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod066c0252_91a6_43e7_9164_3576a49ec5f8.slice/crio-8edbf11ef5c8e9542ddcf5e14cd07647ac25a91201b049f0c5ec444b2d683fce WatchSource:0}: Error finding container 8edbf11ef5c8e9542ddcf5e14cd07647ac25a91201b049f0c5ec444b2d683fce: Status 404 returned error can't find the container with id 8edbf11ef5c8e9542ddcf5e14cd07647ac25a91201b049f0c5ec444b2d683fce Feb 18 06:07:51 crc kubenswrapper[4869]: I0218 06:07:51.276444 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"90248f87-9c36-4636-8c28-366356c9924e","Type":"ContainerStarted","Data":"3fa8903f8d74e5660856497d96efa0c272fd87ce3e48036f5fe2f837939e70d9"} Feb 18 06:07:51 crc kubenswrapper[4869]: I0218 06:07:51.276730 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"90248f87-9c36-4636-8c28-366356c9924e","Type":"ContainerStarted","Data":"5a50e19c14bf8ce37863afde944897ab92dc1ed19bcd118b405b8f910f8abbcd"} Feb 18 06:07:51 crc kubenswrapper[4869]: I0218 06:07:51.280703 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"066c0252-91a6-43e7-9164-3576a49ec5f8","Type":"ContainerStarted","Data":"88906d1972ff554fef8ad7ed81d3dbd4423d493df8fb5560849abc028bfdea69"} Feb 18 06:07:51 crc kubenswrapper[4869]: I0218 06:07:51.280786 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"066c0252-91a6-43e7-9164-3576a49ec5f8","Type":"ContainerStarted","Data":"8edbf11ef5c8e9542ddcf5e14cd07647ac25a91201b049f0c5ec444b2d683fce"} Feb 18 06:07:51 crc kubenswrapper[4869]: I0218 06:07:51.306060 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.306043395 podStartE2EDuration="2.306043395s" podCreationTimestamp="2026-02-18 06:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:07:51.302048178 +0000 UTC m=+1168.471136410" watchObservedRunningTime="2026-02-18 06:07:51.306043395 +0000 UTC m=+1168.475131627" Feb 18 06:07:51 crc kubenswrapper[4869]: I0218 06:07:51.335686 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.335658344 podStartE2EDuration="2.335658344s" podCreationTimestamp="2026-02-18 06:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:07:51.323349768 +0000 UTC m=+1168.492438060" watchObservedRunningTime="2026-02-18 06:07:51.335658344 +0000 UTC m=+1168.504746606" Feb 18 06:07:53 crc kubenswrapper[4869]: I0218 06:07:53.607591 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 18 06:07:54 crc kubenswrapper[4869]: I0218 06:07:54.557053 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 18 06:07:54 crc kubenswrapper[4869]: I0218 06:07:54.558323 4869 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 18 06:07:54 crc kubenswrapper[4869]: I0218 06:07:54.972012 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 18 06:07:55 crc kubenswrapper[4869]: I0218 06:07:55.567879 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="79bdd2ba-f727-450b-a1ee-08dd5c68e84f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 18 06:07:55 crc kubenswrapper[4869]: I0218 06:07:55.567886 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="79bdd2ba-f727-450b-a1ee-08dd5c68e84f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 18 06:07:56 crc kubenswrapper[4869]: I0218 06:07:56.669667 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 18 06:07:59 crc kubenswrapper[4869]: I0218 06:07:59.686590 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 18 06:07:59 crc kubenswrapper[4869]: I0218 06:07:59.687052 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 18 06:07:59 crc kubenswrapper[4869]: I0218 06:07:59.972731 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 18 06:07:59 crc kubenswrapper[4869]: I0218 06:07:59.999864 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 18 06:08:00 crc kubenswrapper[4869]: I0218 06:08:00.365029 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] 
Feb 18 06:08:00 crc kubenswrapper[4869]: I0218 06:08:00.365839 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="128630be-af69-4db6-bad0-59f17dc9dec0" containerName="kube-state-metrics" containerID="cri-o://258272a79799cbbf19c708f1c9836878f13b16b684680baad23a96582051cdb6" gracePeriod=30 Feb 18 06:08:00 crc kubenswrapper[4869]: I0218 06:08:00.414447 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 18 06:08:00 crc kubenswrapper[4869]: I0218 06:08:00.780855 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="90248f87-9c36-4636-8c28-366356c9924e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 18 06:08:00 crc kubenswrapper[4869]: I0218 06:08:00.781124 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="90248f87-9c36-4636-8c28-366356c9924e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 18 06:08:00 crc kubenswrapper[4869]: I0218 06:08:00.935423 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 18 06:08:00 crc kubenswrapper[4869]: I0218 06:08:00.967599 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mshl\" (UniqueName: \"kubernetes.io/projected/128630be-af69-4db6-bad0-59f17dc9dec0-kube-api-access-9mshl\") pod \"128630be-af69-4db6-bad0-59f17dc9dec0\" (UID: \"128630be-af69-4db6-bad0-59f17dc9dec0\") " Feb 18 06:08:00 crc kubenswrapper[4869]: I0218 06:08:00.976065 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/128630be-af69-4db6-bad0-59f17dc9dec0-kube-api-access-9mshl" (OuterVolumeSpecName: "kube-api-access-9mshl") pod "128630be-af69-4db6-bad0-59f17dc9dec0" (UID: "128630be-af69-4db6-bad0-59f17dc9dec0"). InnerVolumeSpecName "kube-api-access-9mshl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.070252 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mshl\" (UniqueName: \"kubernetes.io/projected/128630be-af69-4db6-bad0-59f17dc9dec0-kube-api-access-9mshl\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.392220 4869 generic.go:334] "Generic (PLEG): container finished" podID="128630be-af69-4db6-bad0-59f17dc9dec0" containerID="258272a79799cbbf19c708f1c9836878f13b16b684680baad23a96582051cdb6" exitCode=2 Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.392972 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.396792 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"128630be-af69-4db6-bad0-59f17dc9dec0","Type":"ContainerDied","Data":"258272a79799cbbf19c708f1c9836878f13b16b684680baad23a96582051cdb6"} Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.396824 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"128630be-af69-4db6-bad0-59f17dc9dec0","Type":"ContainerDied","Data":"d04413323b249d5047f79747ce1a1897734459bfac4161d23c6ce5ca1db5463a"} Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.396841 4869 scope.go:117] "RemoveContainer" containerID="258272a79799cbbf19c708f1c9836878f13b16b684680baad23a96582051cdb6" Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.422811 4869 scope.go:117] "RemoveContainer" containerID="258272a79799cbbf19c708f1c9836878f13b16b684680baad23a96582051cdb6" Feb 18 06:08:01 crc kubenswrapper[4869]: E0218 06:08:01.423348 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"258272a79799cbbf19c708f1c9836878f13b16b684680baad23a96582051cdb6\": container with ID starting with 258272a79799cbbf19c708f1c9836878f13b16b684680baad23a96582051cdb6 not found: ID does not exist" containerID="258272a79799cbbf19c708f1c9836878f13b16b684680baad23a96582051cdb6" Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.423390 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"258272a79799cbbf19c708f1c9836878f13b16b684680baad23a96582051cdb6"} err="failed to get container status \"258272a79799cbbf19c708f1c9836878f13b16b684680baad23a96582051cdb6\": rpc error: code = NotFound desc = could not find container \"258272a79799cbbf19c708f1c9836878f13b16b684680baad23a96582051cdb6\": container with ID starting with 
258272a79799cbbf19c708f1c9836878f13b16b684680baad23a96582051cdb6 not found: ID does not exist"
Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.437377 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.446831 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.455869 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 18 06:08:01 crc kubenswrapper[4869]: E0218 06:08:01.456391 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128630be-af69-4db6-bad0-59f17dc9dec0" containerName="kube-state-metrics"
Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.456412 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="128630be-af69-4db6-bad0-59f17dc9dec0" containerName="kube-state-metrics"
Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.456769 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="128630be-af69-4db6-bad0-59f17dc9dec0" containerName="kube-state-metrics"
Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.457642 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.463005 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.467000 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.472986 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.484817 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ecd7b730-0094-40d2-9894-d90d45f8c2de-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ecd7b730-0094-40d2-9894-d90d45f8c2de\") " pod="openstack/kube-state-metrics-0"
Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.484884 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecd7b730-0094-40d2-9894-d90d45f8c2de-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ecd7b730-0094-40d2-9894-d90d45f8c2de\") " pod="openstack/kube-state-metrics-0"
Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.484902 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44rm6\" (UniqueName: \"kubernetes.io/projected/ecd7b730-0094-40d2-9894-d90d45f8c2de-kube-api-access-44rm6\") pod \"kube-state-metrics-0\" (UID: \"ecd7b730-0094-40d2-9894-d90d45f8c2de\") " pod="openstack/kube-state-metrics-0"
Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.485085 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecd7b730-0094-40d2-9894-d90d45f8c2de-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ecd7b730-0094-40d2-9894-d90d45f8c2de\") " pod="openstack/kube-state-metrics-0"
Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.487961 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="128630be-af69-4db6-bad0-59f17dc9dec0" path="/var/lib/kubelet/pods/128630be-af69-4db6-bad0-59f17dc9dec0/volumes"
Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.587224 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ecd7b730-0094-40d2-9894-d90d45f8c2de-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ecd7b730-0094-40d2-9894-d90d45f8c2de\") " pod="openstack/kube-state-metrics-0"
Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.587285 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecd7b730-0094-40d2-9894-d90d45f8c2de-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ecd7b730-0094-40d2-9894-d90d45f8c2de\") " pod="openstack/kube-state-metrics-0"
Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.587307 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44rm6\" (UniqueName: \"kubernetes.io/projected/ecd7b730-0094-40d2-9894-d90d45f8c2de-kube-api-access-44rm6\") pod \"kube-state-metrics-0\" (UID: \"ecd7b730-0094-40d2-9894-d90d45f8c2de\") " pod="openstack/kube-state-metrics-0"
Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.587412 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecd7b730-0094-40d2-9894-d90d45f8c2de-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ecd7b730-0094-40d2-9894-d90d45f8c2de\") " pod="openstack/kube-state-metrics-0"
Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.590941 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ecd7b730-0094-40d2-9894-d90d45f8c2de-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ecd7b730-0094-40d2-9894-d90d45f8c2de\") " pod="openstack/kube-state-metrics-0"
Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.591288 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecd7b730-0094-40d2-9894-d90d45f8c2de-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ecd7b730-0094-40d2-9894-d90d45f8c2de\") " pod="openstack/kube-state-metrics-0"
Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.592565 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecd7b730-0094-40d2-9894-d90d45f8c2de-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ecd7b730-0094-40d2-9894-d90d45f8c2de\") " pod="openstack/kube-state-metrics-0"
Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.604170 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44rm6\" (UniqueName: \"kubernetes.io/projected/ecd7b730-0094-40d2-9894-d90d45f8c2de-kube-api-access-44rm6\") pod \"kube-state-metrics-0\" (UID: \"ecd7b730-0094-40d2-9894-d90d45f8c2de\") " pod="openstack/kube-state-metrics-0"
Feb 18 06:08:01 crc kubenswrapper[4869]: I0218 06:08:01.773510 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 18 06:08:02 crc kubenswrapper[4869]: I0218 06:08:02.313287 4869 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 18 06:08:02 crc kubenswrapper[4869]: I0218 06:08:02.316888 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 18 06:08:02 crc kubenswrapper[4869]: I0218 06:08:02.402475 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ecd7b730-0094-40d2-9894-d90d45f8c2de","Type":"ContainerStarted","Data":"d38e54a4f597a530e196a3a7f3196168dc0969fa7987616bdb1e8954b24a3a2f"}
Feb 18 06:08:02 crc kubenswrapper[4869]: I0218 06:08:02.606897 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 06:08:02 crc kubenswrapper[4869]: I0218 06:08:02.607238 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2da880ed-7e25-4267-87cc-d093f502b847" containerName="ceilometer-central-agent" containerID="cri-o://2775c65506d434021fa41ea918c1a72c653426c64e1c829a543a0b4e13adf337" gracePeriod=30
Feb 18 06:08:02 crc kubenswrapper[4869]: I0218 06:08:02.607373 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2da880ed-7e25-4267-87cc-d093f502b847" containerName="ceilometer-notification-agent" containerID="cri-o://d2e56b71a49f97756fad8968f47081cce78697d80b8fa86c03b2b129104577a3" gracePeriod=30
Feb 18 06:08:02 crc kubenswrapper[4869]: I0218 06:08:02.607397 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2da880ed-7e25-4267-87cc-d093f502b847" containerName="sg-core" containerID="cri-o://7b94cd7c60ba828a168655f13c766ff8f47e1e654737d9b94a8f1786bf69e465" gracePeriod=30
Feb 18 06:08:02 crc kubenswrapper[4869]: I0218 06:08:02.607515 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2da880ed-7e25-4267-87cc-d093f502b847" containerName="proxy-httpd" containerID="cri-o://ab80c8598c5df28b4fcdc603bf3722a1c06f868361661b25ab191b31c754a7b4" gracePeriod=30
Feb 18 06:08:03 crc kubenswrapper[4869]: I0218 06:08:03.412123 4869 generic.go:334] "Generic (PLEG): container finished" podID="2da880ed-7e25-4267-87cc-d093f502b847" containerID="ab80c8598c5df28b4fcdc603bf3722a1c06f868361661b25ab191b31c754a7b4" exitCode=0
Feb 18 06:08:03 crc kubenswrapper[4869]: I0218 06:08:03.412481 4869 generic.go:334] "Generic (PLEG): container finished" podID="2da880ed-7e25-4267-87cc-d093f502b847" containerID="7b94cd7c60ba828a168655f13c766ff8f47e1e654737d9b94a8f1786bf69e465" exitCode=2
Feb 18 06:08:03 crc kubenswrapper[4869]: I0218 06:08:03.412495 4869 generic.go:334] "Generic (PLEG): container finished" podID="2da880ed-7e25-4267-87cc-d093f502b847" containerID="2775c65506d434021fa41ea918c1a72c653426c64e1c829a543a0b4e13adf337" exitCode=0
Feb 18 06:08:03 crc kubenswrapper[4869]: I0218 06:08:03.412220 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2da880ed-7e25-4267-87cc-d093f502b847","Type":"ContainerDied","Data":"ab80c8598c5df28b4fcdc603bf3722a1c06f868361661b25ab191b31c754a7b4"}
Feb 18 06:08:03 crc kubenswrapper[4869]: I0218 06:08:03.412563 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2da880ed-7e25-4267-87cc-d093f502b847","Type":"ContainerDied","Data":"7b94cd7c60ba828a168655f13c766ff8f47e1e654737d9b94a8f1786bf69e465"}
Feb 18 06:08:03 crc kubenswrapper[4869]: I0218 06:08:03.412576 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2da880ed-7e25-4267-87cc-d093f502b847","Type":"ContainerDied","Data":"2775c65506d434021fa41ea918c1a72c653426c64e1c829a543a0b4e13adf337"}
Feb 18 06:08:03 crc kubenswrapper[4869]: I0218 06:08:03.414252 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ecd7b730-0094-40d2-9894-d90d45f8c2de","Type":"ContainerStarted","Data":"99768e44949b959937595f06e974ee327cbd54c35a574f1538bdca488adc2609"}
Feb 18 06:08:03 crc kubenswrapper[4869]: I0218 06:08:03.414458 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Feb 18 06:08:03 crc kubenswrapper[4869]: I0218 06:08:03.441165 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.982954346 podStartE2EDuration="2.441140102s" podCreationTimestamp="2026-02-18 06:08:01 +0000 UTC" firstStartedPulling="2026-02-18 06:08:02.31308909 +0000 UTC m=+1179.482177322" lastFinishedPulling="2026-02-18 06:08:02.771274846 +0000 UTC m=+1179.940363078" observedRunningTime="2026-02-18 06:08:03.43094594 +0000 UTC m=+1180.600034232" watchObservedRunningTime="2026-02-18 06:08:03.441140102 +0000 UTC m=+1180.610228374"
Feb 18 06:08:04 crc kubenswrapper[4869]: I0218 06:08:04.560651 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 18 06:08:04 crc kubenswrapper[4869]: I0218 06:08:04.565861 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 18 06:08:04 crc kubenswrapper[4869]: I0218 06:08:04.588231 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 18 06:08:05 crc kubenswrapper[4869]: I0218 06:08:05.434329 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.397785 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.446726 4869 generic.go:334] "Generic (PLEG): container finished" podID="040199a9-8498-4011-b50c-051616a21fff" containerID="552facee59a394c895ebf23a585386632c7f13a46c8a7a6232254cca28753acb" exitCode=137
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.446790 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"040199a9-8498-4011-b50c-051616a21fff","Type":"ContainerDied","Data":"552facee59a394c895ebf23a585386632c7f13a46c8a7a6232254cca28753acb"}
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.446829 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.446850 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"040199a9-8498-4011-b50c-051616a21fff","Type":"ContainerDied","Data":"eb9405e6590ade2755a61e81812e9ab7ba710a21b252025d0b2d7779cbc83335"}
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.446870 4869 scope.go:117] "RemoveContainer" containerID="552facee59a394c895ebf23a585386632c7f13a46c8a7a6232254cca28753acb"
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.486590 4869 scope.go:117] "RemoveContainer" containerID="552facee59a394c895ebf23a585386632c7f13a46c8a7a6232254cca28753acb"
Feb 18 06:08:07 crc kubenswrapper[4869]: E0218 06:08:07.487459 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"552facee59a394c895ebf23a585386632c7f13a46c8a7a6232254cca28753acb\": container with ID starting with 552facee59a394c895ebf23a585386632c7f13a46c8a7a6232254cca28753acb not found: ID does not exist" containerID="552facee59a394c895ebf23a585386632c7f13a46c8a7a6232254cca28753acb"
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.487492 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"552facee59a394c895ebf23a585386632c7f13a46c8a7a6232254cca28753acb"} err="failed to get container status \"552facee59a394c895ebf23a585386632c7f13a46c8a7a6232254cca28753acb\": rpc error: code = NotFound desc = could not find container \"552facee59a394c895ebf23a585386632c7f13a46c8a7a6232254cca28753acb\": container with ID starting with 552facee59a394c895ebf23a585386632c7f13a46c8a7a6232254cca28753acb not found: ID does not exist"
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.509295 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/040199a9-8498-4011-b50c-051616a21fff-combined-ca-bundle\") pod \"040199a9-8498-4011-b50c-051616a21fff\" (UID: \"040199a9-8498-4011-b50c-051616a21fff\") "
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.509417 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc8hk\" (UniqueName: \"kubernetes.io/projected/040199a9-8498-4011-b50c-051616a21fff-kube-api-access-fc8hk\") pod \"040199a9-8498-4011-b50c-051616a21fff\" (UID: \"040199a9-8498-4011-b50c-051616a21fff\") "
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.509449 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/040199a9-8498-4011-b50c-051616a21fff-config-data\") pod \"040199a9-8498-4011-b50c-051616a21fff\" (UID: \"040199a9-8498-4011-b50c-051616a21fff\") "
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.528123 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/040199a9-8498-4011-b50c-051616a21fff-kube-api-access-fc8hk" (OuterVolumeSpecName: "kube-api-access-fc8hk") pod "040199a9-8498-4011-b50c-051616a21fff" (UID: "040199a9-8498-4011-b50c-051616a21fff"). InnerVolumeSpecName "kube-api-access-fc8hk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.549906 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/040199a9-8498-4011-b50c-051616a21fff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "040199a9-8498-4011-b50c-051616a21fff" (UID: "040199a9-8498-4011-b50c-051616a21fff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.550180 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/040199a9-8498-4011-b50c-051616a21fff-config-data" (OuterVolumeSpecName: "config-data") pod "040199a9-8498-4011-b50c-051616a21fff" (UID: "040199a9-8498-4011-b50c-051616a21fff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.612178 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/040199a9-8498-4011-b50c-051616a21fff-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.612614 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc8hk\" (UniqueName: \"kubernetes.io/projected/040199a9-8498-4011-b50c-051616a21fff-kube-api-access-fc8hk\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.612639 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/040199a9-8498-4011-b50c-051616a21fff-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.797916 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.828855 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.839222 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 18 06:08:07 crc kubenswrapper[4869]: E0218 06:08:07.839918 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="040199a9-8498-4011-b50c-051616a21fff" containerName="nova-cell1-novncproxy-novncproxy"
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.839935 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="040199a9-8498-4011-b50c-051616a21fff" containerName="nova-cell1-novncproxy-novncproxy"
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.840167 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="040199a9-8498-4011-b50c-051616a21fff" containerName="nova-cell1-novncproxy-novncproxy"
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.841209 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.846971 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.847218 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.847345 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.856602 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.916991 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfbj9\" (UniqueName: \"kubernetes.io/projected/50575a1a-5d98-4692-b1ca-d275e90c6fed-kube-api-access-sfbj9\") pod \"nova-cell1-novncproxy-0\" (UID: \"50575a1a-5d98-4692-b1ca-d275e90c6fed\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.917047 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/50575a1a-5d98-4692-b1ca-d275e90c6fed-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"50575a1a-5d98-4692-b1ca-d275e90c6fed\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.917131 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50575a1a-5d98-4692-b1ca-d275e90c6fed-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"50575a1a-5d98-4692-b1ca-d275e90c6fed\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.917159 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/50575a1a-5d98-4692-b1ca-d275e90c6fed-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"50575a1a-5d98-4692-b1ca-d275e90c6fed\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.917208 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50575a1a-5d98-4692-b1ca-d275e90c6fed-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"50575a1a-5d98-4692-b1ca-d275e90c6fed\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:08:07 crc kubenswrapper[4869]: I0218 06:08:07.964590 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.021002 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2da880ed-7e25-4267-87cc-d093f502b847-config-data\") pod \"2da880ed-7e25-4267-87cc-d093f502b847\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") "
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.021102 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndxn8\" (UniqueName: \"kubernetes.io/projected/2da880ed-7e25-4267-87cc-d093f502b847-kube-api-access-ndxn8\") pod \"2da880ed-7e25-4267-87cc-d093f502b847\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") "
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.022034 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2da880ed-7e25-4267-87cc-d093f502b847-scripts\") pod \"2da880ed-7e25-4267-87cc-d093f502b847\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") "
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.022153 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2da880ed-7e25-4267-87cc-d093f502b847-combined-ca-bundle\") pod \"2da880ed-7e25-4267-87cc-d093f502b847\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") "
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.022255 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2da880ed-7e25-4267-87cc-d093f502b847-sg-core-conf-yaml\") pod \"2da880ed-7e25-4267-87cc-d093f502b847\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") "
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.022312 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2da880ed-7e25-4267-87cc-d093f502b847-run-httpd\") pod \"2da880ed-7e25-4267-87cc-d093f502b847\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") "
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.022357 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2da880ed-7e25-4267-87cc-d093f502b847-log-httpd\") pod \"2da880ed-7e25-4267-87cc-d093f502b847\" (UID: \"2da880ed-7e25-4267-87cc-d093f502b847\") "
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.023193 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/50575a1a-5d98-4692-b1ca-d275e90c6fed-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"50575a1a-5d98-4692-b1ca-d275e90c6fed\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.023287 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50575a1a-5d98-4692-b1ca-d275e90c6fed-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"50575a1a-5d98-4692-b1ca-d275e90c6fed\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.023556 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfbj9\" (UniqueName: \"kubernetes.io/projected/50575a1a-5d98-4692-b1ca-d275e90c6fed-kube-api-access-sfbj9\") pod \"nova-cell1-novncproxy-0\" (UID: \"50575a1a-5d98-4692-b1ca-d275e90c6fed\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.023629 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/50575a1a-5d98-4692-b1ca-d275e90c6fed-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"50575a1a-5d98-4692-b1ca-d275e90c6fed\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.023792 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50575a1a-5d98-4692-b1ca-d275e90c6fed-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"50575a1a-5d98-4692-b1ca-d275e90c6fed\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.024494 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2da880ed-7e25-4267-87cc-d093f502b847-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2da880ed-7e25-4267-87cc-d093f502b847" (UID: "2da880ed-7e25-4267-87cc-d093f502b847"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.025724 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2da880ed-7e25-4267-87cc-d093f502b847-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2da880ed-7e25-4267-87cc-d093f502b847" (UID: "2da880ed-7e25-4267-87cc-d093f502b847"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.027429 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2da880ed-7e25-4267-87cc-d093f502b847-kube-api-access-ndxn8" (OuterVolumeSpecName: "kube-api-access-ndxn8") pod "2da880ed-7e25-4267-87cc-d093f502b847" (UID: "2da880ed-7e25-4267-87cc-d093f502b847"). InnerVolumeSpecName "kube-api-access-ndxn8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.044168 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/50575a1a-5d98-4692-b1ca-d275e90c6fed-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"50575a1a-5d98-4692-b1ca-d275e90c6fed\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.046307 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfbj9\" (UniqueName: \"kubernetes.io/projected/50575a1a-5d98-4692-b1ca-d275e90c6fed-kube-api-access-sfbj9\") pod \"nova-cell1-novncproxy-0\" (UID: \"50575a1a-5d98-4692-b1ca-d275e90c6fed\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.050301 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50575a1a-5d98-4692-b1ca-d275e90c6fed-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"50575a1a-5d98-4692-b1ca-d275e90c6fed\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.050391 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/50575a1a-5d98-4692-b1ca-d275e90c6fed-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"50575a1a-5d98-4692-b1ca-d275e90c6fed\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.055655 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50575a1a-5d98-4692-b1ca-d275e90c6fed-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"50575a1a-5d98-4692-b1ca-d275e90c6fed\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.061240 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2da880ed-7e25-4267-87cc-d093f502b847-scripts" (OuterVolumeSpecName: "scripts") pod "2da880ed-7e25-4267-87cc-d093f502b847" (UID: "2da880ed-7e25-4267-87cc-d093f502b847"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.073778 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2da880ed-7e25-4267-87cc-d093f502b847-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2da880ed-7e25-4267-87cc-d093f502b847" (UID: "2da880ed-7e25-4267-87cc-d093f502b847"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.106270 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2da880ed-7e25-4267-87cc-d093f502b847-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2da880ed-7e25-4267-87cc-d093f502b847" (UID: "2da880ed-7e25-4267-87cc-d093f502b847"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.125196 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndxn8\" (UniqueName: \"kubernetes.io/projected/2da880ed-7e25-4267-87cc-d093f502b847-kube-api-access-ndxn8\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.125235 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2da880ed-7e25-4267-87cc-d093f502b847-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.125248 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2da880ed-7e25-4267-87cc-d093f502b847-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.125260 4869 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2da880ed-7e25-4267-87cc-d093f502b847-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.125272 4869 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2da880ed-7e25-4267-87cc-d093f502b847-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.125301 4869 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2da880ed-7e25-4267-87cc-d093f502b847-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.150568 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2da880ed-7e25-4267-87cc-d093f502b847-config-data" (OuterVolumeSpecName: "config-data") pod "2da880ed-7e25-4267-87cc-d093f502b847" (UID: "2da880ed-7e25-4267-87cc-d093f502b847"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.166033 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.226780 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2da880ed-7e25-4267-87cc-d093f502b847-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.458410 4869 generic.go:334] "Generic (PLEG): container finished" podID="2da880ed-7e25-4267-87cc-d093f502b847" containerID="d2e56b71a49f97756fad8968f47081cce78697d80b8fa86c03b2b129104577a3" exitCode=0
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.458475 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2da880ed-7e25-4267-87cc-d093f502b847","Type":"ContainerDied","Data":"d2e56b71a49f97756fad8968f47081cce78697d80b8fa86c03b2b129104577a3"}
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.458503 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2da880ed-7e25-4267-87cc-d093f502b847","Type":"ContainerDied","Data":"773c3a27fc14f46353ac8c7d99f55590e806c491b0cd6ebba2495b49b7cb9ecf"}
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.458520 4869 scope.go:117] "RemoveContainer" containerID="ab80c8598c5df28b4fcdc603bf3722a1c06f868361661b25ab191b31c754a7b4"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.458529 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.495694 4869 scope.go:117] "RemoveContainer" containerID="7b94cd7c60ba828a168655f13c766ff8f47e1e654737d9b94a8f1786bf69e465"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.509891 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.520227 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.541471 4869 scope.go:117] "RemoveContainer" containerID="d2e56b71a49f97756fad8968f47081cce78697d80b8fa86c03b2b129104577a3"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.544310 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 18 06:08:08 crc kubenswrapper[4869]: E0218 06:08:08.544723 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2da880ed-7e25-4267-87cc-d093f502b847" containerName="proxy-httpd"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.544735 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="2da880ed-7e25-4267-87cc-d093f502b847" containerName="proxy-httpd"
Feb 18 06:08:08 crc kubenswrapper[4869]: E0218 06:08:08.545256 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2da880ed-7e25-4267-87cc-d093f502b847" containerName="sg-core"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.545265 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="2da880ed-7e25-4267-87cc-d093f502b847" containerName="sg-core"
Feb 18 06:08:08 crc kubenswrapper[4869]: E0218 06:08:08.545279 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2da880ed-7e25-4267-87cc-d093f502b847" containerName="ceilometer-notification-agent"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.545285 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="2da880ed-7e25-4267-87cc-d093f502b847" containerName="ceilometer-notification-agent"
Feb 18 06:08:08 crc kubenswrapper[4869]: E0218 06:08:08.545309 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2da880ed-7e25-4267-87cc-d093f502b847" containerName="ceilometer-central-agent"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.545315 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="2da880ed-7e25-4267-87cc-d093f502b847" containerName="ceilometer-central-agent"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.545491 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="2da880ed-7e25-4267-87cc-d093f502b847" containerName="proxy-httpd"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.545506 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="2da880ed-7e25-4267-87cc-d093f502b847" containerName="ceilometer-central-agent"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.545528 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="2da880ed-7e25-4267-87cc-d093f502b847" containerName="sg-core"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.545538 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="2da880ed-7e25-4267-87cc-d093f502b847" containerName="ceilometer-notification-agent"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.548973 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.552803 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.555719 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.558659 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.562234 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.601054 4869 scope.go:117] "RemoveContainer" containerID="2775c65506d434021fa41ea918c1a72c653426c64e1c829a543a0b4e13adf337"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.610630 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.635911 4869 scope.go:117] "RemoveContainer" containerID="ab80c8598c5df28b4fcdc603bf3722a1c06f868361661b25ab191b31c754a7b4"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.636801 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88188cac-b45f-4edb-bed3-c45ca99907b0-log-httpd\") pod \"ceilometer-0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") " pod="openstack/ceilometer-0"
Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.636861 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfqpw\" (UniqueName: \"kubernetes.io/projected/88188cac-b45f-4edb-bed3-c45ca99907b0-kube-api-access-vfqpw\") pod \"ceilometer-0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") " pod="openstack/ceilometer-0"
Feb 18 06:08:08 crc
kubenswrapper[4869]: I0218 06:08:08.636887 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") " pod="openstack/ceilometer-0" Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.636922 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88188cac-b45f-4edb-bed3-c45ca99907b0-run-httpd\") pod \"ceilometer-0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") " pod="openstack/ceilometer-0" Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.636969 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-config-data\") pod \"ceilometer-0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") " pod="openstack/ceilometer-0" Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.637050 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") " pod="openstack/ceilometer-0" Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.637076 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") " pod="openstack/ceilometer-0" Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.637101 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-scripts\") pod \"ceilometer-0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") " pod="openstack/ceilometer-0" Feb 18 06:08:08 crc kubenswrapper[4869]: E0218 06:08:08.667076 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab80c8598c5df28b4fcdc603bf3722a1c06f868361661b25ab191b31c754a7b4\": container with ID starting with ab80c8598c5df28b4fcdc603bf3722a1c06f868361661b25ab191b31c754a7b4 not found: ID does not exist" containerID="ab80c8598c5df28b4fcdc603bf3722a1c06f868361661b25ab191b31c754a7b4" Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.667475 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab80c8598c5df28b4fcdc603bf3722a1c06f868361661b25ab191b31c754a7b4"} err="failed to get container status \"ab80c8598c5df28b4fcdc603bf3722a1c06f868361661b25ab191b31c754a7b4\": rpc error: code = NotFound desc = could not find container \"ab80c8598c5df28b4fcdc603bf3722a1c06f868361661b25ab191b31c754a7b4\": container with ID starting with ab80c8598c5df28b4fcdc603bf3722a1c06f868361661b25ab191b31c754a7b4 not found: ID does not exist" Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.667510 4869 scope.go:117] "RemoveContainer" containerID="7b94cd7c60ba828a168655f13c766ff8f47e1e654737d9b94a8f1786bf69e465" Feb 18 06:08:08 crc kubenswrapper[4869]: E0218 06:08:08.667854 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b94cd7c60ba828a168655f13c766ff8f47e1e654737d9b94a8f1786bf69e465\": container with ID starting with 7b94cd7c60ba828a168655f13c766ff8f47e1e654737d9b94a8f1786bf69e465 not found: ID does not exist" containerID="7b94cd7c60ba828a168655f13c766ff8f47e1e654737d9b94a8f1786bf69e465" Feb 18 06:08:08 crc 
kubenswrapper[4869]: I0218 06:08:08.667893 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b94cd7c60ba828a168655f13c766ff8f47e1e654737d9b94a8f1786bf69e465"} err="failed to get container status \"7b94cd7c60ba828a168655f13c766ff8f47e1e654737d9b94a8f1786bf69e465\": rpc error: code = NotFound desc = could not find container \"7b94cd7c60ba828a168655f13c766ff8f47e1e654737d9b94a8f1786bf69e465\": container with ID starting with 7b94cd7c60ba828a168655f13c766ff8f47e1e654737d9b94a8f1786bf69e465 not found: ID does not exist" Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.667912 4869 scope.go:117] "RemoveContainer" containerID="d2e56b71a49f97756fad8968f47081cce78697d80b8fa86c03b2b129104577a3" Feb 18 06:08:08 crc kubenswrapper[4869]: E0218 06:08:08.676917 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2e56b71a49f97756fad8968f47081cce78697d80b8fa86c03b2b129104577a3\": container with ID starting with d2e56b71a49f97756fad8968f47081cce78697d80b8fa86c03b2b129104577a3 not found: ID does not exist" containerID="d2e56b71a49f97756fad8968f47081cce78697d80b8fa86c03b2b129104577a3" Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.676962 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2e56b71a49f97756fad8968f47081cce78697d80b8fa86c03b2b129104577a3"} err="failed to get container status \"d2e56b71a49f97756fad8968f47081cce78697d80b8fa86c03b2b129104577a3\": rpc error: code = NotFound desc = could not find container \"d2e56b71a49f97756fad8968f47081cce78697d80b8fa86c03b2b129104577a3\": container with ID starting with d2e56b71a49f97756fad8968f47081cce78697d80b8fa86c03b2b129104577a3 not found: ID does not exist" Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.676991 4869 scope.go:117] "RemoveContainer" containerID="2775c65506d434021fa41ea918c1a72c653426c64e1c829a543a0b4e13adf337" Feb 18 
06:08:08 crc kubenswrapper[4869]: E0218 06:08:08.677589 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2775c65506d434021fa41ea918c1a72c653426c64e1c829a543a0b4e13adf337\": container with ID starting with 2775c65506d434021fa41ea918c1a72c653426c64e1c829a543a0b4e13adf337 not found: ID does not exist" containerID="2775c65506d434021fa41ea918c1a72c653426c64e1c829a543a0b4e13adf337" Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.677620 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2775c65506d434021fa41ea918c1a72c653426c64e1c829a543a0b4e13adf337"} err="failed to get container status \"2775c65506d434021fa41ea918c1a72c653426c64e1c829a543a0b4e13adf337\": rpc error: code = NotFound desc = could not find container \"2775c65506d434021fa41ea918c1a72c653426c64e1c829a543a0b4e13adf337\": container with ID starting with 2775c65506d434021fa41ea918c1a72c653426c64e1c829a543a0b4e13adf337 not found: ID does not exist" Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.739663 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-scripts\") pod \"ceilometer-0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") " pod="openstack/ceilometer-0" Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.739771 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88188cac-b45f-4edb-bed3-c45ca99907b0-log-httpd\") pod \"ceilometer-0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") " pod="openstack/ceilometer-0" Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.739802 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfqpw\" (UniqueName: 
\"kubernetes.io/projected/88188cac-b45f-4edb-bed3-c45ca99907b0-kube-api-access-vfqpw\") pod \"ceilometer-0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") " pod="openstack/ceilometer-0" Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.739824 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") " pod="openstack/ceilometer-0" Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.739847 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88188cac-b45f-4edb-bed3-c45ca99907b0-run-httpd\") pod \"ceilometer-0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") " pod="openstack/ceilometer-0" Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.739880 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-config-data\") pod \"ceilometer-0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") " pod="openstack/ceilometer-0" Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.739936 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") " pod="openstack/ceilometer-0" Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.739953 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") " pod="openstack/ceilometer-0" Feb 18 06:08:08 crc 
kubenswrapper[4869]: I0218 06:08:08.740666 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88188cac-b45f-4edb-bed3-c45ca99907b0-log-httpd\") pod \"ceilometer-0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") " pod="openstack/ceilometer-0" Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.746312 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88188cac-b45f-4edb-bed3-c45ca99907b0-run-httpd\") pod \"ceilometer-0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") " pod="openstack/ceilometer-0" Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.753340 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-scripts\") pod \"ceilometer-0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") " pod="openstack/ceilometer-0" Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.753697 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") " pod="openstack/ceilometer-0" Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.759380 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") " pod="openstack/ceilometer-0" Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.759894 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"88188cac-b45f-4edb-bed3-c45ca99907b0\") " pod="openstack/ceilometer-0" Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.761325 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-config-data\") pod \"ceilometer-0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") " pod="openstack/ceilometer-0" Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.773554 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfqpw\" (UniqueName: \"kubernetes.io/projected/88188cac-b45f-4edb-bed3-c45ca99907b0-kube-api-access-vfqpw\") pod \"ceilometer-0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") " pod="openstack/ceilometer-0" Feb 18 06:08:08 crc kubenswrapper[4869]: I0218 06:08:08.873764 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:08:09 crc kubenswrapper[4869]: I0218 06:08:09.164380 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:08:09 crc kubenswrapper[4869]: I0218 06:08:09.486083 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="040199a9-8498-4011-b50c-051616a21fff" path="/var/lib/kubelet/pods/040199a9-8498-4011-b50c-051616a21fff/volumes" Feb 18 06:08:09 crc kubenswrapper[4869]: I0218 06:08:09.486822 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2da880ed-7e25-4267-87cc-d093f502b847" path="/var/lib/kubelet/pods/2da880ed-7e25-4267-87cc-d093f502b847/volumes" Feb 18 06:08:09 crc kubenswrapper[4869]: I0218 06:08:09.487454 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"50575a1a-5d98-4692-b1ca-d275e90c6fed","Type":"ContainerStarted","Data":"0c0d6ade6d513287fd601a4aa8759259b4aaeb7b039ea9856527168061f3f882"} Feb 18 06:08:09 crc kubenswrapper[4869]: I0218 06:08:09.487480 4869 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"50575a1a-5d98-4692-b1ca-d275e90c6fed","Type":"ContainerStarted","Data":"fb87638081b0507d5d2dcf1bed3cc5ff82c0a94e8ece5825f83be5207a04cf04"} Feb 18 06:08:09 crc kubenswrapper[4869]: I0218 06:08:09.488398 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88188cac-b45f-4edb-bed3-c45ca99907b0","Type":"ContainerStarted","Data":"e348620c24c453a462f84689839f71c68868427662e26758e146dfac845d20ff"} Feb 18 06:08:09 crc kubenswrapper[4869]: I0218 06:08:09.516631 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.5157302 podStartE2EDuration="2.5157302s" podCreationTimestamp="2026-02-18 06:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:08:09.511334936 +0000 UTC m=+1186.680423178" watchObservedRunningTime="2026-02-18 06:08:09.5157302 +0000 UTC m=+1186.684818452" Feb 18 06:08:09 crc kubenswrapper[4869]: I0218 06:08:09.690097 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 18 06:08:09 crc kubenswrapper[4869]: I0218 06:08:09.690728 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 18 06:08:09 crc kubenswrapper[4869]: I0218 06:08:09.691060 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 18 06:08:09 crc kubenswrapper[4869]: I0218 06:08:09.694209 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 18 06:08:10 crc kubenswrapper[4869]: I0218 06:08:10.500026 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"88188cac-b45f-4edb-bed3-c45ca99907b0","Type":"ContainerStarted","Data":"06d272d643f14361588b6dc6750d794517f0f62f37269e69e002eedca444b5d7"} Feb 18 06:08:10 crc kubenswrapper[4869]: I0218 06:08:10.500442 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 18 06:08:10 crc kubenswrapper[4869]: I0218 06:08:10.500460 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88188cac-b45f-4edb-bed3-c45ca99907b0","Type":"ContainerStarted","Data":"fca614aceec0f62ccafcb0b71e8ec51950359f08d048a03e86c1a703cb96ba45"} Feb 18 06:08:10 crc kubenswrapper[4869]: I0218 06:08:10.505076 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 18 06:08:10 crc kubenswrapper[4869]: I0218 06:08:10.702064 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-hz2g7"] Feb 18 06:08:10 crc kubenswrapper[4869]: I0218 06:08:10.705327 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" Feb 18 06:08:10 crc kubenswrapper[4869]: I0218 06:08:10.709990 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-hz2g7"] Feb 18 06:08:10 crc kubenswrapper[4869]: I0218 06:08:10.796858 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-hz2g7\" (UID: \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" Feb 18 06:08:10 crc kubenswrapper[4869]: I0218 06:08:10.797044 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm8d4\" (UniqueName: \"kubernetes.io/projected/a3515130-3f9f-42ab-8bc7-6d357e1d645a-kube-api-access-cm8d4\") pod \"dnsmasq-dns-89c5cd4d5-hz2g7\" (UID: \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" Feb 18 06:08:10 crc kubenswrapper[4869]: I0218 06:08:10.797182 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-hz2g7\" (UID: \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" Feb 18 06:08:10 crc kubenswrapper[4869]: I0218 06:08:10.797620 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-hz2g7\" (UID: \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" Feb 18 06:08:10 crc kubenswrapper[4869]: I0218 06:08:10.797819 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-config\") pod \"dnsmasq-dns-89c5cd4d5-hz2g7\" (UID: \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" Feb 18 06:08:10 crc kubenswrapper[4869]: I0218 06:08:10.798096 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-hz2g7\" (UID: \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" Feb 18 06:08:10 crc kubenswrapper[4869]: I0218 06:08:10.901162 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-hz2g7\" (UID: \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" Feb 18 06:08:10 crc kubenswrapper[4869]: I0218 06:08:10.901392 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm8d4\" (UniqueName: \"kubernetes.io/projected/a3515130-3f9f-42ab-8bc7-6d357e1d645a-kube-api-access-cm8d4\") pod \"dnsmasq-dns-89c5cd4d5-hz2g7\" (UID: \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" Feb 18 06:08:10 crc kubenswrapper[4869]: I0218 06:08:10.901518 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-hz2g7\" (UID: \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" Feb 18 06:08:10 crc kubenswrapper[4869]: I0218 06:08:10.901632 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-hz2g7\" (UID: \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" Feb 18 06:08:10 crc kubenswrapper[4869]: I0218 06:08:10.901723 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-config\") pod \"dnsmasq-dns-89c5cd4d5-hz2g7\" (UID: \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" Feb 18 06:08:10 crc kubenswrapper[4869]: I0218 06:08:10.901853 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-hz2g7\" (UID: \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" Feb 18 06:08:10 crc kubenswrapper[4869]: I0218 06:08:10.902729 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-hz2g7\" (UID: \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" Feb 18 06:08:10 crc kubenswrapper[4869]: I0218 06:08:10.902778 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-hz2g7\" (UID: \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" Feb 18 06:08:10 crc kubenswrapper[4869]: I0218 06:08:10.903337 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-hz2g7\" (UID: \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" Feb 18 06:08:10 crc kubenswrapper[4869]: I0218 06:08:10.903524 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-config\") pod \"dnsmasq-dns-89c5cd4d5-hz2g7\" (UID: \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" Feb 18 06:08:10 crc kubenswrapper[4869]: I0218 06:08:10.903914 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-hz2g7\" (UID: \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" Feb 18 06:08:10 crc kubenswrapper[4869]: I0218 06:08:10.925713 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm8d4\" (UniqueName: \"kubernetes.io/projected/a3515130-3f9f-42ab-8bc7-6d357e1d645a-kube-api-access-cm8d4\") pod \"dnsmasq-dns-89c5cd4d5-hz2g7\" (UID: \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" Feb 18 06:08:11 crc kubenswrapper[4869]: I0218 06:08:11.050100 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" Feb 18 06:08:11 crc kubenswrapper[4869]: I0218 06:08:11.511670 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88188cac-b45f-4edb-bed3-c45ca99907b0","Type":"ContainerStarted","Data":"274f91e44f9ea1240fa0bc824947f3ef97bd7b0e7d9ef5bed3431b7b5aa47b91"} Feb 18 06:08:11 crc kubenswrapper[4869]: I0218 06:08:11.575158 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-hz2g7"] Feb 18 06:08:11 crc kubenswrapper[4869]: W0218 06:08:11.577123 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3515130_3f9f_42ab_8bc7_6d357e1d645a.slice/crio-b87cdedd15337500e74b18d0cbe47da17a30787a9469d0ce85670611979fffed WatchSource:0}: Error finding container b87cdedd15337500e74b18d0cbe47da17a30787a9469d0ce85670611979fffed: Status 404 returned error can't find the container with id b87cdedd15337500e74b18d0cbe47da17a30787a9469d0ce85670611979fffed Feb 18 06:08:11 crc kubenswrapper[4869]: I0218 06:08:11.789450 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 18 06:08:12 crc kubenswrapper[4869]: I0218 06:08:12.520294 4869 generic.go:334] "Generic (PLEG): container finished" podID="a3515130-3f9f-42ab-8bc7-6d357e1d645a" containerID="3ef2c1796fc3362d9041e6c069a247f92e632f34768d72c11a310397dd391003" exitCode=0 Feb 18 06:08:12 crc kubenswrapper[4869]: I0218 06:08:12.520375 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" event={"ID":"a3515130-3f9f-42ab-8bc7-6d357e1d645a","Type":"ContainerDied","Data":"3ef2c1796fc3362d9041e6c069a247f92e632f34768d72c11a310397dd391003"} Feb 18 06:08:12 crc kubenswrapper[4869]: I0218 06:08:12.520399 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" 
event={"ID":"a3515130-3f9f-42ab-8bc7-6d357e1d645a","Type":"ContainerStarted","Data":"b87cdedd15337500e74b18d0cbe47da17a30787a9469d0ce85670611979fffed"}
Feb 18 06:08:12 crc kubenswrapper[4869]: I0218 06:08:12.963200 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 18 06:08:13 crc kubenswrapper[4869]: I0218 06:08:13.166832 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:08:13 crc kubenswrapper[4869]: I0218 06:08:13.530145 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" event={"ID":"a3515130-3f9f-42ab-8bc7-6d357e1d645a","Type":"ContainerStarted","Data":"f17beb1933a37194a53aa2bcc4b11f95a8db3d48f1f0ae74716c7849df76137c"}
Feb 18 06:08:13 crc kubenswrapper[4869]: I0218 06:08:13.531387 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7"
Feb 18 06:08:13 crc kubenswrapper[4869]: I0218 06:08:13.534030 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="90248f87-9c36-4636-8c28-366356c9924e" containerName="nova-api-log" containerID="cri-o://5a50e19c14bf8ce37863afde944897ab92dc1ed19bcd118b405b8f910f8abbcd" gracePeriod=30
Feb 18 06:08:13 crc kubenswrapper[4869]: I0218 06:08:13.535127 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88188cac-b45f-4edb-bed3-c45ca99907b0","Type":"ContainerStarted","Data":"3ff289d5d7e210e2fd20c6d0fefc0b3306e09c4773a7807ad36fba4747019833"}
Feb 18 06:08:13 crc kubenswrapper[4869]: I0218 06:08:13.535162 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 18 06:08:13 crc kubenswrapper[4869]: I0218 06:08:13.535224 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="90248f87-9c36-4636-8c28-366356c9924e" containerName="nova-api-api" containerID="cri-o://3fa8903f8d74e5660856497d96efa0c272fd87ce3e48036f5fe2f837939e70d9" gracePeriod=30
Feb 18 06:08:13 crc kubenswrapper[4869]: I0218 06:08:13.559126 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" podStartSLOduration=3.559110584 podStartE2EDuration="3.559110584s" podCreationTimestamp="2026-02-18 06:08:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:08:13.554108259 +0000 UTC m=+1190.723196491" watchObservedRunningTime="2026-02-18 06:08:13.559110584 +0000 UTC m=+1190.728198816"
Feb 18 06:08:13 crc kubenswrapper[4869]: I0218 06:08:13.604710 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.103585502 podStartE2EDuration="5.604692032s" podCreationTimestamp="2026-02-18 06:08:08 +0000 UTC" firstStartedPulling="2026-02-18 06:08:09.170374766 +0000 UTC m=+1186.339462998" lastFinishedPulling="2026-02-18 06:08:12.671481296 +0000 UTC m=+1189.840569528" observedRunningTime="2026-02-18 06:08:13.602651578 +0000 UTC m=+1190.771739810" watchObservedRunningTime="2026-02-18 06:08:13.604692032 +0000 UTC m=+1190.773780264"
Feb 18 06:08:14 crc kubenswrapper[4869]: I0218 06:08:14.550720 4869 generic.go:334] "Generic (PLEG): container finished" podID="90248f87-9c36-4636-8c28-366356c9924e" containerID="5a50e19c14bf8ce37863afde944897ab92dc1ed19bcd118b405b8f910f8abbcd" exitCode=143
Feb 18 06:08:14 crc kubenswrapper[4869]: I0218 06:08:14.550780 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"90248f87-9c36-4636-8c28-366356c9924e","Type":"ContainerDied","Data":"5a50e19c14bf8ce37863afde944897ab92dc1ed19bcd118b405b8f910f8abbcd"}
Feb 18 06:08:14 crc kubenswrapper[4869]: I0218 06:08:14.644924 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 06:08:15 crc kubenswrapper[4869]: I0218 06:08:15.563297 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88188cac-b45f-4edb-bed3-c45ca99907b0" containerName="ceilometer-central-agent" containerID="cri-o://fca614aceec0f62ccafcb0b71e8ec51950359f08d048a03e86c1a703cb96ba45" gracePeriod=30
Feb 18 06:08:15 crc kubenswrapper[4869]: I0218 06:08:15.563383 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88188cac-b45f-4edb-bed3-c45ca99907b0" containerName="proxy-httpd" containerID="cri-o://3ff289d5d7e210e2fd20c6d0fefc0b3306e09c4773a7807ad36fba4747019833" gracePeriod=30
Feb 18 06:08:15 crc kubenswrapper[4869]: I0218 06:08:15.563458 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88188cac-b45f-4edb-bed3-c45ca99907b0" containerName="sg-core" containerID="cri-o://274f91e44f9ea1240fa0bc824947f3ef97bd7b0e7d9ef5bed3431b7b5aa47b91" gracePeriod=30
Feb 18 06:08:15 crc kubenswrapper[4869]: I0218 06:08:15.563477 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88188cac-b45f-4edb-bed3-c45ca99907b0" containerName="ceilometer-notification-agent" containerID="cri-o://06d272d643f14361588b6dc6750d794517f0f62f37269e69e002eedca444b5d7" gracePeriod=30
Feb 18 06:08:16 crc kubenswrapper[4869]: I0218 06:08:16.593098 4869 generic.go:334] "Generic (PLEG): container finished" podID="88188cac-b45f-4edb-bed3-c45ca99907b0" containerID="3ff289d5d7e210e2fd20c6d0fefc0b3306e09c4773a7807ad36fba4747019833" exitCode=0
Feb 18 06:08:16 crc kubenswrapper[4869]: I0218 06:08:16.593468 4869 generic.go:334] "Generic (PLEG): container finished" podID="88188cac-b45f-4edb-bed3-c45ca99907b0" containerID="274f91e44f9ea1240fa0bc824947f3ef97bd7b0e7d9ef5bed3431b7b5aa47b91" exitCode=2
Feb 18 06:08:16 crc kubenswrapper[4869]: I0218 06:08:16.593485 4869 generic.go:334] "Generic (PLEG): container finished" podID="88188cac-b45f-4edb-bed3-c45ca99907b0" containerID="06d272d643f14361588b6dc6750d794517f0f62f37269e69e002eedca444b5d7" exitCode=0
Feb 18 06:08:16 crc kubenswrapper[4869]: I0218 06:08:16.593165 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88188cac-b45f-4edb-bed3-c45ca99907b0","Type":"ContainerDied","Data":"3ff289d5d7e210e2fd20c6d0fefc0b3306e09c4773a7807ad36fba4747019833"}
Feb 18 06:08:16 crc kubenswrapper[4869]: I0218 06:08:16.593532 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88188cac-b45f-4edb-bed3-c45ca99907b0","Type":"ContainerDied","Data":"274f91e44f9ea1240fa0bc824947f3ef97bd7b0e7d9ef5bed3431b7b5aa47b91"}
Feb 18 06:08:16 crc kubenswrapper[4869]: I0218 06:08:16.593556 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88188cac-b45f-4edb-bed3-c45ca99907b0","Type":"ContainerDied","Data":"06d272d643f14361588b6dc6750d794517f0f62f37269e69e002eedca444b5d7"}
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.134609 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.244070 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90248f87-9c36-4636-8c28-366356c9924e-logs\") pod \"90248f87-9c36-4636-8c28-366356c9924e\" (UID: \"90248f87-9c36-4636-8c28-366356c9924e\") "
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.244164 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhn64\" (UniqueName: \"kubernetes.io/projected/90248f87-9c36-4636-8c28-366356c9924e-kube-api-access-zhn64\") pod \"90248f87-9c36-4636-8c28-366356c9924e\" (UID: \"90248f87-9c36-4636-8c28-366356c9924e\") "
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.244311 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90248f87-9c36-4636-8c28-366356c9924e-config-data\") pod \"90248f87-9c36-4636-8c28-366356c9924e\" (UID: \"90248f87-9c36-4636-8c28-366356c9924e\") "
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.244414 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90248f87-9c36-4636-8c28-366356c9924e-combined-ca-bundle\") pod \"90248f87-9c36-4636-8c28-366356c9924e\" (UID: \"90248f87-9c36-4636-8c28-366356c9924e\") "
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.244642 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90248f87-9c36-4636-8c28-366356c9924e-logs" (OuterVolumeSpecName: "logs") pod "90248f87-9c36-4636-8c28-366356c9924e" (UID: "90248f87-9c36-4636-8c28-366356c9924e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.245194 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90248f87-9c36-4636-8c28-366356c9924e-logs\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.255415 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90248f87-9c36-4636-8c28-366356c9924e-kube-api-access-zhn64" (OuterVolumeSpecName: "kube-api-access-zhn64") pod "90248f87-9c36-4636-8c28-366356c9924e" (UID: "90248f87-9c36-4636-8c28-366356c9924e"). InnerVolumeSpecName "kube-api-access-zhn64". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.291349 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90248f87-9c36-4636-8c28-366356c9924e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90248f87-9c36-4636-8c28-366356c9924e" (UID: "90248f87-9c36-4636-8c28-366356c9924e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.291981 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90248f87-9c36-4636-8c28-366356c9924e-config-data" (OuterVolumeSpecName: "config-data") pod "90248f87-9c36-4636-8c28-366356c9924e" (UID: "90248f87-9c36-4636-8c28-366356c9924e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.346646 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90248f87-9c36-4636-8c28-366356c9924e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.346679 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhn64\" (UniqueName: \"kubernetes.io/projected/90248f87-9c36-4636-8c28-366356c9924e-kube-api-access-zhn64\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.346692 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90248f87-9c36-4636-8c28-366356c9924e-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.606504 4869 generic.go:334] "Generic (PLEG): container finished" podID="90248f87-9c36-4636-8c28-366356c9924e" containerID="3fa8903f8d74e5660856497d96efa0c272fd87ce3e48036f5fe2f837939e70d9" exitCode=0
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.606845 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.606841 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"90248f87-9c36-4636-8c28-366356c9924e","Type":"ContainerDied","Data":"3fa8903f8d74e5660856497d96efa0c272fd87ce3e48036f5fe2f837939e70d9"}
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.607587 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"90248f87-9c36-4636-8c28-366356c9924e","Type":"ContainerDied","Data":"3a6eab19f05944941e3f3fef36d9001fb15375ca5b7328d4a702947d5cbe525d"}
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.607641 4869 scope.go:117] "RemoveContainer" containerID="3fa8903f8d74e5660856497d96efa0c272fd87ce3e48036f5fe2f837939e70d9"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.642139 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.644499 4869 scope.go:117] "RemoveContainer" containerID="5a50e19c14bf8ce37863afde944897ab92dc1ed19bcd118b405b8f910f8abbcd"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.659734 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.670305 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 18 06:08:17 crc kubenswrapper[4869]: E0218 06:08:17.670731 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90248f87-9c36-4636-8c28-366356c9924e" containerName="nova-api-log"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.670762 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="90248f87-9c36-4636-8c28-366356c9924e" containerName="nova-api-log"
Feb 18 06:08:17 crc kubenswrapper[4869]: E0218 06:08:17.670789 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90248f87-9c36-4636-8c28-366356c9924e" containerName="nova-api-api"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.670796 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="90248f87-9c36-4636-8c28-366356c9924e" containerName="nova-api-api"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.670979 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="90248f87-9c36-4636-8c28-366356c9924e" containerName="nova-api-api"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.671004 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="90248f87-9c36-4636-8c28-366356c9924e" containerName="nova-api-log"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.671981 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.674877 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.675054 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.675215 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.677170 4869 scope.go:117] "RemoveContainer" containerID="3fa8903f8d74e5660856497d96efa0c272fd87ce3e48036f5fe2f837939e70d9"
Feb 18 06:08:17 crc kubenswrapper[4869]: E0218 06:08:17.677759 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fa8903f8d74e5660856497d96efa0c272fd87ce3e48036f5fe2f837939e70d9\": container with ID starting with 3fa8903f8d74e5660856497d96efa0c272fd87ce3e48036f5fe2f837939e70d9 not found: ID does not exist" containerID="3fa8903f8d74e5660856497d96efa0c272fd87ce3e48036f5fe2f837939e70d9"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.677797 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fa8903f8d74e5660856497d96efa0c272fd87ce3e48036f5fe2f837939e70d9"} err="failed to get container status \"3fa8903f8d74e5660856497d96efa0c272fd87ce3e48036f5fe2f837939e70d9\": rpc error: code = NotFound desc = could not find container \"3fa8903f8d74e5660856497d96efa0c272fd87ce3e48036f5fe2f837939e70d9\": container with ID starting with 3fa8903f8d74e5660856497d96efa0c272fd87ce3e48036f5fe2f837939e70d9 not found: ID does not exist"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.677832 4869 scope.go:117] "RemoveContainer" containerID="5a50e19c14bf8ce37863afde944897ab92dc1ed19bcd118b405b8f910f8abbcd"
Feb 18 06:08:17 crc kubenswrapper[4869]: E0218 06:08:17.678135 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a50e19c14bf8ce37863afde944897ab92dc1ed19bcd118b405b8f910f8abbcd\": container with ID starting with 5a50e19c14bf8ce37863afde944897ab92dc1ed19bcd118b405b8f910f8abbcd not found: ID does not exist" containerID="5a50e19c14bf8ce37863afde944897ab92dc1ed19bcd118b405b8f910f8abbcd"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.678165 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a50e19c14bf8ce37863afde944897ab92dc1ed19bcd118b405b8f910f8abbcd"} err="failed to get container status \"5a50e19c14bf8ce37863afde944897ab92dc1ed19bcd118b405b8f910f8abbcd\": rpc error: code = NotFound desc = could not find container \"5a50e19c14bf8ce37863afde944897ab92dc1ed19bcd118b405b8f910f8abbcd\": container with ID starting with 5a50e19c14bf8ce37863afde944897ab92dc1ed19bcd118b405b8f910f8abbcd not found: ID does not exist"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.697606 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.756845 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df29ddad-6564-4dfb-9b24-e344ac6535f3-config-data\") pod \"nova-api-0\" (UID: \"df29ddad-6564-4dfb-9b24-e344ac6535f3\") " pod="openstack/nova-api-0"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.756886 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df29ddad-6564-4dfb-9b24-e344ac6535f3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"df29ddad-6564-4dfb-9b24-e344ac6535f3\") " pod="openstack/nova-api-0"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.756955 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df29ddad-6564-4dfb-9b24-e344ac6535f3-public-tls-certs\") pod \"nova-api-0\" (UID: \"df29ddad-6564-4dfb-9b24-e344ac6535f3\") " pod="openstack/nova-api-0"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.757025 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnd94\" (UniqueName: \"kubernetes.io/projected/df29ddad-6564-4dfb-9b24-e344ac6535f3-kube-api-access-xnd94\") pod \"nova-api-0\" (UID: \"df29ddad-6564-4dfb-9b24-e344ac6535f3\") " pod="openstack/nova-api-0"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.757201 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df29ddad-6564-4dfb-9b24-e344ac6535f3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"df29ddad-6564-4dfb-9b24-e344ac6535f3\") " pod="openstack/nova-api-0"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.757274 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df29ddad-6564-4dfb-9b24-e344ac6535f3-logs\") pod \"nova-api-0\" (UID: \"df29ddad-6564-4dfb-9b24-e344ac6535f3\") " pod="openstack/nova-api-0"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.858691 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df29ddad-6564-4dfb-9b24-e344ac6535f3-public-tls-certs\") pod \"nova-api-0\" (UID: \"df29ddad-6564-4dfb-9b24-e344ac6535f3\") " pod="openstack/nova-api-0"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.858847 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnd94\" (UniqueName: \"kubernetes.io/projected/df29ddad-6564-4dfb-9b24-e344ac6535f3-kube-api-access-xnd94\") pod \"nova-api-0\" (UID: \"df29ddad-6564-4dfb-9b24-e344ac6535f3\") " pod="openstack/nova-api-0"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.858917 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df29ddad-6564-4dfb-9b24-e344ac6535f3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"df29ddad-6564-4dfb-9b24-e344ac6535f3\") " pod="openstack/nova-api-0"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.858969 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df29ddad-6564-4dfb-9b24-e344ac6535f3-logs\") pod \"nova-api-0\" (UID: \"df29ddad-6564-4dfb-9b24-e344ac6535f3\") " pod="openstack/nova-api-0"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.859017 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df29ddad-6564-4dfb-9b24-e344ac6535f3-config-data\") pod \"nova-api-0\" (UID: \"df29ddad-6564-4dfb-9b24-e344ac6535f3\") " pod="openstack/nova-api-0"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.859042 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df29ddad-6564-4dfb-9b24-e344ac6535f3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"df29ddad-6564-4dfb-9b24-e344ac6535f3\") " pod="openstack/nova-api-0"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.859619 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df29ddad-6564-4dfb-9b24-e344ac6535f3-logs\") pod \"nova-api-0\" (UID: \"df29ddad-6564-4dfb-9b24-e344ac6535f3\") " pod="openstack/nova-api-0"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.863399 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df29ddad-6564-4dfb-9b24-e344ac6535f3-public-tls-certs\") pod \"nova-api-0\" (UID: \"df29ddad-6564-4dfb-9b24-e344ac6535f3\") " pod="openstack/nova-api-0"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.863488 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df29ddad-6564-4dfb-9b24-e344ac6535f3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"df29ddad-6564-4dfb-9b24-e344ac6535f3\") " pod="openstack/nova-api-0"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.863639 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df29ddad-6564-4dfb-9b24-e344ac6535f3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"df29ddad-6564-4dfb-9b24-e344ac6535f3\") " pod="openstack/nova-api-0"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.865225 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df29ddad-6564-4dfb-9b24-e344ac6535f3-config-data\") pod \"nova-api-0\" (UID: \"df29ddad-6564-4dfb-9b24-e344ac6535f3\") " pod="openstack/nova-api-0"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.882445 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnd94\" (UniqueName: \"kubernetes.io/projected/df29ddad-6564-4dfb-9b24-e344ac6535f3-kube-api-access-xnd94\") pod \"nova-api-0\" (UID: \"df29ddad-6564-4dfb-9b24-e344ac6535f3\") " pod="openstack/nova-api-0"
Feb 18 06:08:17 crc kubenswrapper[4869]: I0218 06:08:17.987228 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.166896 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.200414 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.327393 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.366557 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-scripts\") pod \"88188cac-b45f-4edb-bed3-c45ca99907b0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") "
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.366608 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88188cac-b45f-4edb-bed3-c45ca99907b0-log-httpd\") pod \"88188cac-b45f-4edb-bed3-c45ca99907b0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") "
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.366631 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-ceilometer-tls-certs\") pod \"88188cac-b45f-4edb-bed3-c45ca99907b0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") "
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.366734 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfqpw\" (UniqueName: \"kubernetes.io/projected/88188cac-b45f-4edb-bed3-c45ca99907b0-kube-api-access-vfqpw\") pod \"88188cac-b45f-4edb-bed3-c45ca99907b0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") "
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.366850 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-combined-ca-bundle\") pod \"88188cac-b45f-4edb-bed3-c45ca99907b0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") "
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.366894 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-config-data\") pod \"88188cac-b45f-4edb-bed3-c45ca99907b0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") "
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.366969 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88188cac-b45f-4edb-bed3-c45ca99907b0-run-httpd\") pod \"88188cac-b45f-4edb-bed3-c45ca99907b0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") "
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.367060 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-sg-core-conf-yaml\") pod \"88188cac-b45f-4edb-bed3-c45ca99907b0\" (UID: \"88188cac-b45f-4edb-bed3-c45ca99907b0\") "
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.367120 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88188cac-b45f-4edb-bed3-c45ca99907b0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "88188cac-b45f-4edb-bed3-c45ca99907b0" (UID: "88188cac-b45f-4edb-bed3-c45ca99907b0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.367571 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88188cac-b45f-4edb-bed3-c45ca99907b0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "88188cac-b45f-4edb-bed3-c45ca99907b0" (UID: "88188cac-b45f-4edb-bed3-c45ca99907b0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.367792 4869 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88188cac-b45f-4edb-bed3-c45ca99907b0-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.367809 4869 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88188cac-b45f-4edb-bed3-c45ca99907b0-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.373482 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-scripts" (OuterVolumeSpecName: "scripts") pod "88188cac-b45f-4edb-bed3-c45ca99907b0" (UID: "88188cac-b45f-4edb-bed3-c45ca99907b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.374360 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88188cac-b45f-4edb-bed3-c45ca99907b0-kube-api-access-vfqpw" (OuterVolumeSpecName: "kube-api-access-vfqpw") pod "88188cac-b45f-4edb-bed3-c45ca99907b0" (UID: "88188cac-b45f-4edb-bed3-c45ca99907b0"). InnerVolumeSpecName "kube-api-access-vfqpw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.401338 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "88188cac-b45f-4edb-bed3-c45ca99907b0" (UID: "88188cac-b45f-4edb-bed3-c45ca99907b0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.421190 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "88188cac-b45f-4edb-bed3-c45ca99907b0" (UID: "88188cac-b45f-4edb-bed3-c45ca99907b0"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.479028 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.479058 4869 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.479069 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfqpw\" (UniqueName: \"kubernetes.io/projected/88188cac-b45f-4edb-bed3-c45ca99907b0-kube-api-access-vfqpw\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.479081 4869 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.486809 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88188cac-b45f-4edb-bed3-c45ca99907b0" (UID: "88188cac-b45f-4edb-bed3-c45ca99907b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.508519 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-config-data" (OuterVolumeSpecName: "config-data") pod "88188cac-b45f-4edb-bed3-c45ca99907b0" (UID: "88188cac-b45f-4edb-bed3-c45ca99907b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.520389 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.581130 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.581481 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88188cac-b45f-4edb-bed3-c45ca99907b0-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.624734 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"df29ddad-6564-4dfb-9b24-e344ac6535f3","Type":"ContainerStarted","Data":"f71534b6d1deef0cdf174f2c7537837964f00e5819320ba29fea1156eaaba382"}
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.628077 4869 generic.go:334] "Generic (PLEG): container finished" podID="88188cac-b45f-4edb-bed3-c45ca99907b0" containerID="fca614aceec0f62ccafcb0b71e8ec51950359f08d048a03e86c1a703cb96ba45" exitCode=0
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.628134 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88188cac-b45f-4edb-bed3-c45ca99907b0","Type":"ContainerDied","Data":"fca614aceec0f62ccafcb0b71e8ec51950359f08d048a03e86c1a703cb96ba45"}
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.628157 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.628189 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88188cac-b45f-4edb-bed3-c45ca99907b0","Type":"ContainerDied","Data":"e348620c24c453a462f84689839f71c68868427662e26758e146dfac845d20ff"}
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.628201 4869 scope.go:117] "RemoveContainer" containerID="3ff289d5d7e210e2fd20c6d0fefc0b3306e09c4773a7807ad36fba4747019833"
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.650890 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.655884 4869 scope.go:117] "RemoveContainer" containerID="274f91e44f9ea1240fa0bc824947f3ef97bd7b0e7d9ef5bed3431b7b5aa47b91"
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.668692 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.678761 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.684165 4869 scope.go:117] "RemoveContainer" containerID="06d272d643f14361588b6dc6750d794517f0f62f37269e69e002eedca444b5d7"
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.715510 4869 scope.go:117] "RemoveContainer" containerID="fca614aceec0f62ccafcb0b71e8ec51950359f08d048a03e86c1a703cb96ba45"
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.725487 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 18 06:08:18 crc kubenswrapper[4869]: E0218 06:08:18.725890 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88188cac-b45f-4edb-bed3-c45ca99907b0" containerName="ceilometer-central-agent"
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.725902 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="88188cac-b45f-4edb-bed3-c45ca99907b0" containerName="ceilometer-central-agent"
Feb 18 06:08:18 crc kubenswrapper[4869]: E0218 06:08:18.725919 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88188cac-b45f-4edb-bed3-c45ca99907b0" containerName="proxy-httpd"
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.725924 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="88188cac-b45f-4edb-bed3-c45ca99907b0" containerName="proxy-httpd"
Feb 18 06:08:18 crc kubenswrapper[4869]: E0218 06:08:18.725937 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88188cac-b45f-4edb-bed3-c45ca99907b0" containerName="sg-core"
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.725945 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="88188cac-b45f-4edb-bed3-c45ca99907b0" containerName="sg-core"
Feb 18 06:08:18 crc kubenswrapper[4869]: E0218 06:08:18.725959 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88188cac-b45f-4edb-bed3-c45ca99907b0" containerName="ceilometer-notification-agent"
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.725965 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="88188cac-b45f-4edb-bed3-c45ca99907b0" containerName="ceilometer-notification-agent"
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.726153 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="88188cac-b45f-4edb-bed3-c45ca99907b0" containerName="sg-core"
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.726174 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="88188cac-b45f-4edb-bed3-c45ca99907b0" containerName="proxy-httpd"
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.726187 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="88188cac-b45f-4edb-bed3-c45ca99907b0" containerName="ceilometer-notification-agent"
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.726202 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="88188cac-b45f-4edb-bed3-c45ca99907b0" containerName="ceilometer-central-agent"
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.728120 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.734428 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.734643 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.734814 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.742513 4869 scope.go:117] "RemoveContainer" containerID="3ff289d5d7e210e2fd20c6d0fefc0b3306e09c4773a7807ad36fba4747019833"
Feb 18 06:08:18 crc kubenswrapper[4869]: E0218 06:08:18.743798 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ff289d5d7e210e2fd20c6d0fefc0b3306e09c4773a7807ad36fba4747019833\": container with ID starting with 3ff289d5d7e210e2fd20c6d0fefc0b3306e09c4773a7807ad36fba4747019833 not found: ID does not exist" containerID="3ff289d5d7e210e2fd20c6d0fefc0b3306e09c4773a7807ad36fba4747019833"
Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.743830 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ff289d5d7e210e2fd20c6d0fefc0b3306e09c4773a7807ad36fba4747019833"} err="failed to get container status \"3ff289d5d7e210e2fd20c6d0fefc0b3306e09c4773a7807ad36fba4747019833\": rpc error: code = NotFound desc = could not find container \"3ff289d5d7e210e2fd20c6d0fefc0b3306e09c4773a7807ad36fba4747019833\": container with ID starting with 3ff289d5d7e210e2fd20c6d0fefc0b3306e09c4773a7807ad36fba4747019833
not found: ID does not exist" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.743852 4869 scope.go:117] "RemoveContainer" containerID="274f91e44f9ea1240fa0bc824947f3ef97bd7b0e7d9ef5bed3431b7b5aa47b91" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.755420 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:08:18 crc kubenswrapper[4869]: E0218 06:08:18.759712 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"274f91e44f9ea1240fa0bc824947f3ef97bd7b0e7d9ef5bed3431b7b5aa47b91\": container with ID starting with 274f91e44f9ea1240fa0bc824947f3ef97bd7b0e7d9ef5bed3431b7b5aa47b91 not found: ID does not exist" containerID="274f91e44f9ea1240fa0bc824947f3ef97bd7b0e7d9ef5bed3431b7b5aa47b91" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.759781 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"274f91e44f9ea1240fa0bc824947f3ef97bd7b0e7d9ef5bed3431b7b5aa47b91"} err="failed to get container status \"274f91e44f9ea1240fa0bc824947f3ef97bd7b0e7d9ef5bed3431b7b5aa47b91\": rpc error: code = NotFound desc = could not find container \"274f91e44f9ea1240fa0bc824947f3ef97bd7b0e7d9ef5bed3431b7b5aa47b91\": container with ID starting with 274f91e44f9ea1240fa0bc824947f3ef97bd7b0e7d9ef5bed3431b7b5aa47b91 not found: ID does not exist" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.759810 4869 scope.go:117] "RemoveContainer" containerID="06d272d643f14361588b6dc6750d794517f0f62f37269e69e002eedca444b5d7" Feb 18 06:08:18 crc kubenswrapper[4869]: E0218 06:08:18.777203 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06d272d643f14361588b6dc6750d794517f0f62f37269e69e002eedca444b5d7\": container with ID starting with 06d272d643f14361588b6dc6750d794517f0f62f37269e69e002eedca444b5d7 not found: ID does not exist" 
containerID="06d272d643f14361588b6dc6750d794517f0f62f37269e69e002eedca444b5d7" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.777253 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06d272d643f14361588b6dc6750d794517f0f62f37269e69e002eedca444b5d7"} err="failed to get container status \"06d272d643f14361588b6dc6750d794517f0f62f37269e69e002eedca444b5d7\": rpc error: code = NotFound desc = could not find container \"06d272d643f14361588b6dc6750d794517f0f62f37269e69e002eedca444b5d7\": container with ID starting with 06d272d643f14361588b6dc6750d794517f0f62f37269e69e002eedca444b5d7 not found: ID does not exist" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.777287 4869 scope.go:117] "RemoveContainer" containerID="fca614aceec0f62ccafcb0b71e8ec51950359f08d048a03e86c1a703cb96ba45" Feb 18 06:08:18 crc kubenswrapper[4869]: E0218 06:08:18.777902 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fca614aceec0f62ccafcb0b71e8ec51950359f08d048a03e86c1a703cb96ba45\": container with ID starting with fca614aceec0f62ccafcb0b71e8ec51950359f08d048a03e86c1a703cb96ba45 not found: ID does not exist" containerID="fca614aceec0f62ccafcb0b71e8ec51950359f08d048a03e86c1a703cb96ba45" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.777931 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fca614aceec0f62ccafcb0b71e8ec51950359f08d048a03e86c1a703cb96ba45"} err="failed to get container status \"fca614aceec0f62ccafcb0b71e8ec51950359f08d048a03e86c1a703cb96ba45\": rpc error: code = NotFound desc = could not find container \"fca614aceec0f62ccafcb0b71e8ec51950359f08d048a03e86c1a703cb96ba45\": container with ID starting with fca614aceec0f62ccafcb0b71e8ec51950359f08d048a03e86c1a703cb96ba45 not found: ID does not exist" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.785132 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0226ffe-db7d-48b2-acee-8a9f7045c083-run-httpd\") pod \"ceilometer-0\" (UID: \"e0226ffe-db7d-48b2-acee-8a9f7045c083\") " pod="openstack/ceilometer-0" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.785178 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0226ffe-db7d-48b2-acee-8a9f7045c083-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e0226ffe-db7d-48b2-acee-8a9f7045c083\") " pod="openstack/ceilometer-0" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.785245 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0226ffe-db7d-48b2-acee-8a9f7045c083-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e0226ffe-db7d-48b2-acee-8a9f7045c083\") " pod="openstack/ceilometer-0" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.785283 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0226ffe-db7d-48b2-acee-8a9f7045c083-log-httpd\") pod \"ceilometer-0\" (UID: \"e0226ffe-db7d-48b2-acee-8a9f7045c083\") " pod="openstack/ceilometer-0" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.785327 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0226ffe-db7d-48b2-acee-8a9f7045c083-scripts\") pod \"ceilometer-0\" (UID: \"e0226ffe-db7d-48b2-acee-8a9f7045c083\") " pod="openstack/ceilometer-0" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.785346 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/e0226ffe-db7d-48b2-acee-8a9f7045c083-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e0226ffe-db7d-48b2-acee-8a9f7045c083\") " pod="openstack/ceilometer-0" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.785362 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnn54\" (UniqueName: \"kubernetes.io/projected/e0226ffe-db7d-48b2-acee-8a9f7045c083-kube-api-access-nnn54\") pod \"ceilometer-0\" (UID: \"e0226ffe-db7d-48b2-acee-8a9f7045c083\") " pod="openstack/ceilometer-0" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.785508 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0226ffe-db7d-48b2-acee-8a9f7045c083-config-data\") pod \"ceilometer-0\" (UID: \"e0226ffe-db7d-48b2-acee-8a9f7045c083\") " pod="openstack/ceilometer-0" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.864177 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-7ctxv"] Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.871707 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7ctxv" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.874313 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.874382 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.875906 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7ctxv"] Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.886917 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0226ffe-db7d-48b2-acee-8a9f7045c083-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e0226ffe-db7d-48b2-acee-8a9f7045c083\") " pod="openstack/ceilometer-0" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.886971 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0226ffe-db7d-48b2-acee-8a9f7045c083-log-httpd\") pod \"ceilometer-0\" (UID: \"e0226ffe-db7d-48b2-acee-8a9f7045c083\") " pod="openstack/ceilometer-0" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.887035 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0226ffe-db7d-48b2-acee-8a9f7045c083-scripts\") pod \"ceilometer-0\" (UID: \"e0226ffe-db7d-48b2-acee-8a9f7045c083\") " pod="openstack/ceilometer-0" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.887059 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0226ffe-db7d-48b2-acee-8a9f7045c083-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e0226ffe-db7d-48b2-acee-8a9f7045c083\") " pod="openstack/ceilometer-0" Feb 18 
06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.887077 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnn54\" (UniqueName: \"kubernetes.io/projected/e0226ffe-db7d-48b2-acee-8a9f7045c083-kube-api-access-nnn54\") pod \"ceilometer-0\" (UID: \"e0226ffe-db7d-48b2-acee-8a9f7045c083\") " pod="openstack/ceilometer-0" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.887122 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0226ffe-db7d-48b2-acee-8a9f7045c083-config-data\") pod \"ceilometer-0\" (UID: \"e0226ffe-db7d-48b2-acee-8a9f7045c083\") " pod="openstack/ceilometer-0" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.887177 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0226ffe-db7d-48b2-acee-8a9f7045c083-run-httpd\") pod \"ceilometer-0\" (UID: \"e0226ffe-db7d-48b2-acee-8a9f7045c083\") " pod="openstack/ceilometer-0" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.887201 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0226ffe-db7d-48b2-acee-8a9f7045c083-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e0226ffe-db7d-48b2-acee-8a9f7045c083\") " pod="openstack/ceilometer-0" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.892695 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0226ffe-db7d-48b2-acee-8a9f7045c083-log-httpd\") pod \"ceilometer-0\" (UID: \"e0226ffe-db7d-48b2-acee-8a9f7045c083\") " pod="openstack/ceilometer-0" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.892970 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0226ffe-db7d-48b2-acee-8a9f7045c083-run-httpd\") 
pod \"ceilometer-0\" (UID: \"e0226ffe-db7d-48b2-acee-8a9f7045c083\") " pod="openstack/ceilometer-0" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.894921 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0226ffe-db7d-48b2-acee-8a9f7045c083-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e0226ffe-db7d-48b2-acee-8a9f7045c083\") " pod="openstack/ceilometer-0" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.895602 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0226ffe-db7d-48b2-acee-8a9f7045c083-config-data\") pod \"ceilometer-0\" (UID: \"e0226ffe-db7d-48b2-acee-8a9f7045c083\") " pod="openstack/ceilometer-0" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.897469 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0226ffe-db7d-48b2-acee-8a9f7045c083-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e0226ffe-db7d-48b2-acee-8a9f7045c083\") " pod="openstack/ceilometer-0" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.898290 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0226ffe-db7d-48b2-acee-8a9f7045c083-scripts\") pod \"ceilometer-0\" (UID: \"e0226ffe-db7d-48b2-acee-8a9f7045c083\") " pod="openstack/ceilometer-0" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.902609 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0226ffe-db7d-48b2-acee-8a9f7045c083-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e0226ffe-db7d-48b2-acee-8a9f7045c083\") " pod="openstack/ceilometer-0" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.907709 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnn54\" 
(UniqueName: \"kubernetes.io/projected/e0226ffe-db7d-48b2-acee-8a9f7045c083-kube-api-access-nnn54\") pod \"ceilometer-0\" (UID: \"e0226ffe-db7d-48b2-acee-8a9f7045c083\") " pod="openstack/ceilometer-0" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.989506 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhvbn\" (UniqueName: \"kubernetes.io/projected/9913d3a8-cea5-410a-a39e-270de53de317-kube-api-access-fhvbn\") pod \"nova-cell1-cell-mapping-7ctxv\" (UID: \"9913d3a8-cea5-410a-a39e-270de53de317\") " pod="openstack/nova-cell1-cell-mapping-7ctxv" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.989565 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9913d3a8-cea5-410a-a39e-270de53de317-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7ctxv\" (UID: \"9913d3a8-cea5-410a-a39e-270de53de317\") " pod="openstack/nova-cell1-cell-mapping-7ctxv" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.989636 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9913d3a8-cea5-410a-a39e-270de53de317-config-data\") pod \"nova-cell1-cell-mapping-7ctxv\" (UID: \"9913d3a8-cea5-410a-a39e-270de53de317\") " pod="openstack/nova-cell1-cell-mapping-7ctxv" Feb 18 06:08:18 crc kubenswrapper[4869]: I0218 06:08:18.990085 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9913d3a8-cea5-410a-a39e-270de53de317-scripts\") pod \"nova-cell1-cell-mapping-7ctxv\" (UID: \"9913d3a8-cea5-410a-a39e-270de53de317\") " pod="openstack/nova-cell1-cell-mapping-7ctxv" Feb 18 06:08:19 crc kubenswrapper[4869]: I0218 06:08:19.062254 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:08:19 crc kubenswrapper[4869]: I0218 06:08:19.095124 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9913d3a8-cea5-410a-a39e-270de53de317-scripts\") pod \"nova-cell1-cell-mapping-7ctxv\" (UID: \"9913d3a8-cea5-410a-a39e-270de53de317\") " pod="openstack/nova-cell1-cell-mapping-7ctxv" Feb 18 06:08:19 crc kubenswrapper[4869]: I0218 06:08:19.095424 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhvbn\" (UniqueName: \"kubernetes.io/projected/9913d3a8-cea5-410a-a39e-270de53de317-kube-api-access-fhvbn\") pod \"nova-cell1-cell-mapping-7ctxv\" (UID: \"9913d3a8-cea5-410a-a39e-270de53de317\") " pod="openstack/nova-cell1-cell-mapping-7ctxv" Feb 18 06:08:19 crc kubenswrapper[4869]: I0218 06:08:19.096418 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9913d3a8-cea5-410a-a39e-270de53de317-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7ctxv\" (UID: \"9913d3a8-cea5-410a-a39e-270de53de317\") " pod="openstack/nova-cell1-cell-mapping-7ctxv" Feb 18 06:08:19 crc kubenswrapper[4869]: I0218 06:08:19.096650 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9913d3a8-cea5-410a-a39e-270de53de317-config-data\") pod \"nova-cell1-cell-mapping-7ctxv\" (UID: \"9913d3a8-cea5-410a-a39e-270de53de317\") " pod="openstack/nova-cell1-cell-mapping-7ctxv" Feb 18 06:08:19 crc kubenswrapper[4869]: I0218 06:08:19.100763 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9913d3a8-cea5-410a-a39e-270de53de317-scripts\") pod \"nova-cell1-cell-mapping-7ctxv\" (UID: \"9913d3a8-cea5-410a-a39e-270de53de317\") " pod="openstack/nova-cell1-cell-mapping-7ctxv" Feb 18 06:08:19 crc 
kubenswrapper[4869]: I0218 06:08:19.101875 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9913d3a8-cea5-410a-a39e-270de53de317-config-data\") pod \"nova-cell1-cell-mapping-7ctxv\" (UID: \"9913d3a8-cea5-410a-a39e-270de53de317\") " pod="openstack/nova-cell1-cell-mapping-7ctxv" Feb 18 06:08:19 crc kubenswrapper[4869]: I0218 06:08:19.110357 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9913d3a8-cea5-410a-a39e-270de53de317-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7ctxv\" (UID: \"9913d3a8-cea5-410a-a39e-270de53de317\") " pod="openstack/nova-cell1-cell-mapping-7ctxv" Feb 18 06:08:19 crc kubenswrapper[4869]: I0218 06:08:19.119452 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhvbn\" (UniqueName: \"kubernetes.io/projected/9913d3a8-cea5-410a-a39e-270de53de317-kube-api-access-fhvbn\") pod \"nova-cell1-cell-mapping-7ctxv\" (UID: \"9913d3a8-cea5-410a-a39e-270de53de317\") " pod="openstack/nova-cell1-cell-mapping-7ctxv" Feb 18 06:08:19 crc kubenswrapper[4869]: I0218 06:08:19.325871 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7ctxv" Feb 18 06:08:19 crc kubenswrapper[4869]: I0218 06:08:19.494382 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88188cac-b45f-4edb-bed3-c45ca99907b0" path="/var/lib/kubelet/pods/88188cac-b45f-4edb-bed3-c45ca99907b0/volumes" Feb 18 06:08:19 crc kubenswrapper[4869]: I0218 06:08:19.495279 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90248f87-9c36-4636-8c28-366356c9924e" path="/var/lib/kubelet/pods/90248f87-9c36-4636-8c28-366356c9924e/volumes" Feb 18 06:08:19 crc kubenswrapper[4869]: I0218 06:08:19.628316 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:08:19 crc kubenswrapper[4869]: W0218 06:08:19.659851 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0226ffe_db7d_48b2_acee_8a9f7045c083.slice/crio-73ccd8354095b5c543a1bfd22924f15b3db4ad50001bf06d705df17a58d56727 WatchSource:0}: Error finding container 73ccd8354095b5c543a1bfd22924f15b3db4ad50001bf06d705df17a58d56727: Status 404 returned error can't find the container with id 73ccd8354095b5c543a1bfd22924f15b3db4ad50001bf06d705df17a58d56727 Feb 18 06:08:19 crc kubenswrapper[4869]: I0218 06:08:19.686201 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"df29ddad-6564-4dfb-9b24-e344ac6535f3","Type":"ContainerStarted","Data":"75e748a23c8b8ca4e811ebf712198d6e5c1c1e36f1191dbdd747390340b00543"} Feb 18 06:08:19 crc kubenswrapper[4869]: I0218 06:08:19.686239 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"df29ddad-6564-4dfb-9b24-e344ac6535f3","Type":"ContainerStarted","Data":"6f30be7c865b134b199df7710ebb4f143c4e1ce07c44360ad7293603551b71ab"} Feb 18 06:08:19 crc kubenswrapper[4869]: I0218 06:08:19.763427 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-api-0" podStartSLOduration=2.7633961080000002 podStartE2EDuration="2.763396108s" podCreationTimestamp="2026-02-18 06:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:08:19.729183612 +0000 UTC m=+1196.898271844" watchObservedRunningTime="2026-02-18 06:08:19.763396108 +0000 UTC m=+1196.932484340" Feb 18 06:08:19 crc kubenswrapper[4869]: I0218 06:08:19.910034 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7ctxv"] Feb 18 06:08:20 crc kubenswrapper[4869]: I0218 06:08:20.740072 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7ctxv" event={"ID":"9913d3a8-cea5-410a-a39e-270de53de317","Type":"ContainerStarted","Data":"baabe3f7136cd63129596c7e3c26aeaa0e47832cc6f1bfe5719b3cd61537ab40"} Feb 18 06:08:20 crc kubenswrapper[4869]: I0218 06:08:20.740930 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7ctxv" event={"ID":"9913d3a8-cea5-410a-a39e-270de53de317","Type":"ContainerStarted","Data":"400c29d1926ede95538657a8c3d3a1ba4afa69d6615d623dfa7b188fb09530f4"} Feb 18 06:08:20 crc kubenswrapper[4869]: I0218 06:08:20.770236 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0226ffe-db7d-48b2-acee-8a9f7045c083","Type":"ContainerStarted","Data":"975fa8b7886718a8e2cd47bb5a28ed64151887ec65ca77cc483d1d55057208bd"} Feb 18 06:08:20 crc kubenswrapper[4869]: I0218 06:08:20.770633 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0226ffe-db7d-48b2-acee-8a9f7045c083","Type":"ContainerStarted","Data":"73ccd8354095b5c543a1bfd22924f15b3db4ad50001bf06d705df17a58d56727"} Feb 18 06:08:20 crc kubenswrapper[4869]: I0218 06:08:20.772546 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-7ctxv" 
podStartSLOduration=2.772516816 podStartE2EDuration="2.772516816s" podCreationTimestamp="2026-02-18 06:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:08:20.766420972 +0000 UTC m=+1197.935509214" watchObservedRunningTime="2026-02-18 06:08:20.772516816 +0000 UTC m=+1197.941605058" Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.051973 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.128067 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-wrp7j"] Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.128369 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-wrp7j" podUID="cbaac7cd-b5f2-40f5-8482-c29eafa37f95" containerName="dnsmasq-dns" containerID="cri-o://e7d4c7549611d9133c3ca1001f91e2c44e6e2d2848304616d2f74ea61eb2f5a1" gracePeriod=10 Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.606416 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-wrp7j" Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.664964 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-ovsdbserver-nb\") pod \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\" (UID: \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\") " Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.665083 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-ovsdbserver-sb\") pod \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\" (UID: \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\") " Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.665153 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-dns-svc\") pod \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\" (UID: \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\") " Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.665192 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-dns-swift-storage-0\") pod \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\" (UID: \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\") " Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.665218 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-config\") pod \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\" (UID: \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\") " Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.665367 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tldtq\" 
(UniqueName: \"kubernetes.io/projected/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-kube-api-access-tldtq\") pod \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\" (UID: \"cbaac7cd-b5f2-40f5-8482-c29eafa37f95\") " Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.695705 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-kube-api-access-tldtq" (OuterVolumeSpecName: "kube-api-access-tldtq") pod "cbaac7cd-b5f2-40f5-8482-c29eafa37f95" (UID: "cbaac7cd-b5f2-40f5-8482-c29eafa37f95"). InnerVolumeSpecName "kube-api-access-tldtq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.767848 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tldtq\" (UniqueName: \"kubernetes.io/projected/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-kube-api-access-tldtq\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.796244 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cbaac7cd-b5f2-40f5-8482-c29eafa37f95" (UID: "cbaac7cd-b5f2-40f5-8482-c29eafa37f95"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.802333 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0226ffe-db7d-48b2-acee-8a9f7045c083","Type":"ContainerStarted","Data":"4bcfedb77509a8776b529680e7fb7779fa136207ffe770ca7133445a435c03e2"} Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.804381 4869 generic.go:334] "Generic (PLEG): container finished" podID="cbaac7cd-b5f2-40f5-8482-c29eafa37f95" containerID="e7d4c7549611d9133c3ca1001f91e2c44e6e2d2848304616d2f74ea61eb2f5a1" exitCode=0 Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.804675 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-wrp7j" Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.805481 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-wrp7j" event={"ID":"cbaac7cd-b5f2-40f5-8482-c29eafa37f95","Type":"ContainerDied","Data":"e7d4c7549611d9133c3ca1001f91e2c44e6e2d2848304616d2f74ea61eb2f5a1"} Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.805505 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-wrp7j" event={"ID":"cbaac7cd-b5f2-40f5-8482-c29eafa37f95","Type":"ContainerDied","Data":"a326fbff0636845ea56e113ed2109900fa5cb07d1e2c80a6ae60d809c1629f3a"} Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.805522 4869 scope.go:117] "RemoveContainer" containerID="e7d4c7549611d9133c3ca1001f91e2c44e6e2d2848304616d2f74ea61eb2f5a1" Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.815445 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cbaac7cd-b5f2-40f5-8482-c29eafa37f95" (UID: "cbaac7cd-b5f2-40f5-8482-c29eafa37f95"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.834232 4869 scope.go:117] "RemoveContainer" containerID="4ad4c1c36649bb86f32dbe7a86f98fbdd7583556a226e98328f07c7c2292a91c" Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.835000 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-config" (OuterVolumeSpecName: "config") pod "cbaac7cd-b5f2-40f5-8482-c29eafa37f95" (UID: "cbaac7cd-b5f2-40f5-8482-c29eafa37f95"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.841194 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cbaac7cd-b5f2-40f5-8482-c29eafa37f95" (UID: "cbaac7cd-b5f2-40f5-8482-c29eafa37f95"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.842734 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cbaac7cd-b5f2-40f5-8482-c29eafa37f95" (UID: "cbaac7cd-b5f2-40f5-8482-c29eafa37f95"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.861191 4869 scope.go:117] "RemoveContainer" containerID="e7d4c7549611d9133c3ca1001f91e2c44e6e2d2848304616d2f74ea61eb2f5a1" Feb 18 06:08:21 crc kubenswrapper[4869]: E0218 06:08:21.861706 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7d4c7549611d9133c3ca1001f91e2c44e6e2d2848304616d2f74ea61eb2f5a1\": container with ID starting with e7d4c7549611d9133c3ca1001f91e2c44e6e2d2848304616d2f74ea61eb2f5a1 not found: ID does not exist" containerID="e7d4c7549611d9133c3ca1001f91e2c44e6e2d2848304616d2f74ea61eb2f5a1" Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.861757 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7d4c7549611d9133c3ca1001f91e2c44e6e2d2848304616d2f74ea61eb2f5a1"} err="failed to get container status \"e7d4c7549611d9133c3ca1001f91e2c44e6e2d2848304616d2f74ea61eb2f5a1\": rpc error: code = NotFound desc = could not find container \"e7d4c7549611d9133c3ca1001f91e2c44e6e2d2848304616d2f74ea61eb2f5a1\": container with ID starting with e7d4c7549611d9133c3ca1001f91e2c44e6e2d2848304616d2f74ea61eb2f5a1 not found: ID does not exist" Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.861779 4869 scope.go:117] "RemoveContainer" containerID="4ad4c1c36649bb86f32dbe7a86f98fbdd7583556a226e98328f07c7c2292a91c" Feb 18 06:08:21 crc kubenswrapper[4869]: E0218 06:08:21.862147 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ad4c1c36649bb86f32dbe7a86f98fbdd7583556a226e98328f07c7c2292a91c\": container with ID starting with 4ad4c1c36649bb86f32dbe7a86f98fbdd7583556a226e98328f07c7c2292a91c not found: ID does not exist" containerID="4ad4c1c36649bb86f32dbe7a86f98fbdd7583556a226e98328f07c7c2292a91c" Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.862169 
4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ad4c1c36649bb86f32dbe7a86f98fbdd7583556a226e98328f07c7c2292a91c"} err="failed to get container status \"4ad4c1c36649bb86f32dbe7a86f98fbdd7583556a226e98328f07c7c2292a91c\": rpc error: code = NotFound desc = could not find container \"4ad4c1c36649bb86f32dbe7a86f98fbdd7583556a226e98328f07c7c2292a91c\": container with ID starting with 4ad4c1c36649bb86f32dbe7a86f98fbdd7583556a226e98328f07c7c2292a91c not found: ID does not exist" Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.870124 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.870559 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.870570 4869 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.870580 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:21 crc kubenswrapper[4869]: I0218 06:08:21.870588 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cbaac7cd-b5f2-40f5-8482-c29eafa37f95-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:22 crc kubenswrapper[4869]: I0218 06:08:22.144857 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-757b4f8459-wrp7j"] Feb 18 06:08:22 crc kubenswrapper[4869]: I0218 06:08:22.165164 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-wrp7j"] Feb 18 06:08:22 crc kubenswrapper[4869]: I0218 06:08:22.817586 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0226ffe-db7d-48b2-acee-8a9f7045c083","Type":"ContainerStarted","Data":"a31d7a3a2099c8e3712502bfef8d1bfb0b5f4d7fc0803f0b1990ce366ef89057"} Feb 18 06:08:23 crc kubenswrapper[4869]: I0218 06:08:23.509998 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbaac7cd-b5f2-40f5-8482-c29eafa37f95" path="/var/lib/kubelet/pods/cbaac7cd-b5f2-40f5-8482-c29eafa37f95/volumes" Feb 18 06:08:23 crc kubenswrapper[4869]: I0218 06:08:23.831274 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0226ffe-db7d-48b2-acee-8a9f7045c083","Type":"ContainerStarted","Data":"8d152dee59d9824c8e773919dbf3bee83ce49b26d612a98607d43d078b88a20e"} Feb 18 06:08:23 crc kubenswrapper[4869]: I0218 06:08:23.831547 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 18 06:08:23 crc kubenswrapper[4869]: I0218 06:08:23.862115 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.34994515 podStartE2EDuration="5.862095655s" podCreationTimestamp="2026-02-18 06:08:18 +0000 UTC" firstStartedPulling="2026-02-18 06:08:19.678964465 +0000 UTC m=+1196.848052697" lastFinishedPulling="2026-02-18 06:08:23.19111497 +0000 UTC m=+1200.360203202" observedRunningTime="2026-02-18 06:08:23.859789897 +0000 UTC m=+1201.028878169" watchObservedRunningTime="2026-02-18 06:08:23.862095655 +0000 UTC m=+1201.031183897" Feb 18 06:08:25 crc kubenswrapper[4869]: I0218 06:08:25.863280 4869 generic.go:334] "Generic (PLEG): container finished" podID="9913d3a8-cea5-410a-a39e-270de53de317" 
containerID="baabe3f7136cd63129596c7e3c26aeaa0e47832cc6f1bfe5719b3cd61537ab40" exitCode=0 Feb 18 06:08:25 crc kubenswrapper[4869]: I0218 06:08:25.863363 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7ctxv" event={"ID":"9913d3a8-cea5-410a-a39e-270de53de317","Type":"ContainerDied","Data":"baabe3f7136cd63129596c7e3c26aeaa0e47832cc6f1bfe5719b3cd61537ab40"} Feb 18 06:08:27 crc kubenswrapper[4869]: I0218 06:08:27.221352 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7ctxv" Feb 18 06:08:27 crc kubenswrapper[4869]: I0218 06:08:27.282351 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhvbn\" (UniqueName: \"kubernetes.io/projected/9913d3a8-cea5-410a-a39e-270de53de317-kube-api-access-fhvbn\") pod \"9913d3a8-cea5-410a-a39e-270de53de317\" (UID: \"9913d3a8-cea5-410a-a39e-270de53de317\") " Feb 18 06:08:27 crc kubenswrapper[4869]: I0218 06:08:27.282615 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9913d3a8-cea5-410a-a39e-270de53de317-config-data\") pod \"9913d3a8-cea5-410a-a39e-270de53de317\" (UID: \"9913d3a8-cea5-410a-a39e-270de53de317\") " Feb 18 06:08:27 crc kubenswrapper[4869]: I0218 06:08:27.282773 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9913d3a8-cea5-410a-a39e-270de53de317-combined-ca-bundle\") pod \"9913d3a8-cea5-410a-a39e-270de53de317\" (UID: \"9913d3a8-cea5-410a-a39e-270de53de317\") " Feb 18 06:08:27 crc kubenswrapper[4869]: I0218 06:08:27.282876 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9913d3a8-cea5-410a-a39e-270de53de317-scripts\") pod \"9913d3a8-cea5-410a-a39e-270de53de317\" (UID: 
\"9913d3a8-cea5-410a-a39e-270de53de317\") " Feb 18 06:08:27 crc kubenswrapper[4869]: I0218 06:08:27.288267 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9913d3a8-cea5-410a-a39e-270de53de317-scripts" (OuterVolumeSpecName: "scripts") pod "9913d3a8-cea5-410a-a39e-270de53de317" (UID: "9913d3a8-cea5-410a-a39e-270de53de317"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:08:27 crc kubenswrapper[4869]: I0218 06:08:27.307048 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9913d3a8-cea5-410a-a39e-270de53de317-kube-api-access-fhvbn" (OuterVolumeSpecName: "kube-api-access-fhvbn") pod "9913d3a8-cea5-410a-a39e-270de53de317" (UID: "9913d3a8-cea5-410a-a39e-270de53de317"). InnerVolumeSpecName "kube-api-access-fhvbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:08:27 crc kubenswrapper[4869]: I0218 06:08:27.316921 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9913d3a8-cea5-410a-a39e-270de53de317-config-data" (OuterVolumeSpecName: "config-data") pod "9913d3a8-cea5-410a-a39e-270de53de317" (UID: "9913d3a8-cea5-410a-a39e-270de53de317"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:08:27 crc kubenswrapper[4869]: I0218 06:08:27.318654 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9913d3a8-cea5-410a-a39e-270de53de317-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9913d3a8-cea5-410a-a39e-270de53de317" (UID: "9913d3a8-cea5-410a-a39e-270de53de317"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:08:27 crc kubenswrapper[4869]: I0218 06:08:27.385819 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9913d3a8-cea5-410a-a39e-270de53de317-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:27 crc kubenswrapper[4869]: I0218 06:08:27.385856 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9913d3a8-cea5-410a-a39e-270de53de317-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:27 crc kubenswrapper[4869]: I0218 06:08:27.385868 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhvbn\" (UniqueName: \"kubernetes.io/projected/9913d3a8-cea5-410a-a39e-270de53de317-kube-api-access-fhvbn\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:27 crc kubenswrapper[4869]: I0218 06:08:27.385883 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9913d3a8-cea5-410a-a39e-270de53de317-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:27 crc kubenswrapper[4869]: I0218 06:08:27.886834 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7ctxv" event={"ID":"9913d3a8-cea5-410a-a39e-270de53de317","Type":"ContainerDied","Data":"400c29d1926ede95538657a8c3d3a1ba4afa69d6615d623dfa7b188fb09530f4"} Feb 18 06:08:27 crc kubenswrapper[4869]: I0218 06:08:27.887003 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="400c29d1926ede95538657a8c3d3a1ba4afa69d6615d623dfa7b188fb09530f4" Feb 18 06:08:27 crc kubenswrapper[4869]: I0218 06:08:27.887238 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7ctxv" Feb 18 06:08:27 crc kubenswrapper[4869]: I0218 06:08:27.988207 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 18 06:08:27 crc kubenswrapper[4869]: I0218 06:08:27.988269 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 18 06:08:28 crc kubenswrapper[4869]: I0218 06:08:28.092798 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 18 06:08:28 crc kubenswrapper[4869]: I0218 06:08:28.106316 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 06:08:28 crc kubenswrapper[4869]: I0218 06:08:28.106577 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="066c0252-91a6-43e7-9164-3576a49ec5f8" containerName="nova-scheduler-scheduler" containerID="cri-o://88906d1972ff554fef8ad7ed81d3dbd4423d493df8fb5560849abc028bfdea69" gracePeriod=30 Feb 18 06:08:28 crc kubenswrapper[4869]: I0218 06:08:28.124932 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:08:28 crc kubenswrapper[4869]: I0218 06:08:28.125187 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="79bdd2ba-f727-450b-a1ee-08dd5c68e84f" containerName="nova-metadata-log" containerID="cri-o://73c6e2ec4928699d25deb2f1c78fa2c65bf8b497fbeff5f8e21135d4e8f1c284" gracePeriod=30 Feb 18 06:08:28 crc kubenswrapper[4869]: I0218 06:08:28.125358 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="79bdd2ba-f727-450b-a1ee-08dd5c68e84f" containerName="nova-metadata-metadata" containerID="cri-o://065a24d0c8e4d94507fe2025b0fd4599260ee794c44483e2eefc1f88def23f68" gracePeriod=30 Feb 18 06:08:28 crc kubenswrapper[4869]: I0218 06:08:28.898811 4869 
generic.go:334] "Generic (PLEG): container finished" podID="79bdd2ba-f727-450b-a1ee-08dd5c68e84f" containerID="73c6e2ec4928699d25deb2f1c78fa2c65bf8b497fbeff5f8e21135d4e8f1c284" exitCode=143 Feb 18 06:08:28 crc kubenswrapper[4869]: I0218 06:08:28.898898 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79bdd2ba-f727-450b-a1ee-08dd5c68e84f","Type":"ContainerDied","Data":"73c6e2ec4928699d25deb2f1c78fa2c65bf8b497fbeff5f8e21135d4e8f1c284"} Feb 18 06:08:28 crc kubenswrapper[4869]: I0218 06:08:28.899264 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="df29ddad-6564-4dfb-9b24-e344ac6535f3" containerName="nova-api-log" containerID="cri-o://6f30be7c865b134b199df7710ebb4f143c4e1ce07c44360ad7293603551b71ab" gracePeriod=30 Feb 18 06:08:28 crc kubenswrapper[4869]: I0218 06:08:28.899302 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="df29ddad-6564-4dfb-9b24-e344ac6535f3" containerName="nova-api-api" containerID="cri-o://75e748a23c8b8ca4e811ebf712198d6e5c1c1e36f1191dbdd747390340b00543" gracePeriod=30 Feb 18 06:08:28 crc kubenswrapper[4869]: I0218 06:08:28.904255 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="df29ddad-6564-4dfb-9b24-e344ac6535f3" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": EOF" Feb 18 06:08:28 crc kubenswrapper[4869]: I0218 06:08:28.904277 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="df29ddad-6564-4dfb-9b24-e344ac6535f3" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": EOF" Feb 18 06:08:29 crc kubenswrapper[4869]: I0218 06:08:29.915699 4869 generic.go:334] "Generic (PLEG): container finished" podID="df29ddad-6564-4dfb-9b24-e344ac6535f3" containerID="6f30be7c865b134b199df7710ebb4f143c4e1ce07c44360ad7293603551b71ab" 
exitCode=143 Feb 18 06:08:29 crc kubenswrapper[4869]: I0218 06:08:29.915957 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"df29ddad-6564-4dfb-9b24-e344ac6535f3","Type":"ContainerDied","Data":"6f30be7c865b134b199df7710ebb4f143c4e1ce07c44360ad7293603551b71ab"} Feb 18 06:08:29 crc kubenswrapper[4869]: E0218 06:08:29.974079 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="88906d1972ff554fef8ad7ed81d3dbd4423d493df8fb5560849abc028bfdea69" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 18 06:08:29 crc kubenswrapper[4869]: E0218 06:08:29.975368 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="88906d1972ff554fef8ad7ed81d3dbd4423d493df8fb5560849abc028bfdea69" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 18 06:08:29 crc kubenswrapper[4869]: E0218 06:08:29.976598 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="88906d1972ff554fef8ad7ed81d3dbd4423d493df8fb5560849abc028bfdea69" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 18 06:08:29 crc kubenswrapper[4869]: E0218 06:08:29.976674 4869 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="066c0252-91a6-43e7-9164-3576a49ec5f8" containerName="nova-scheduler-scheduler" Feb 18 06:08:31 crc kubenswrapper[4869]: I0218 06:08:31.259977 4869 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/nova-metadata-0" podUID="79bdd2ba-f727-450b-a1ee-08dd5c68e84f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": read tcp 10.217.0.2:34244->10.217.0.195:8775: read: connection reset by peer" Feb 18 06:08:31 crc kubenswrapper[4869]: I0218 06:08:31.261032 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="79bdd2ba-f727-450b-a1ee-08dd5c68e84f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": read tcp 10.217.0.2:34246->10.217.0.195:8775: read: connection reset by peer" Feb 18 06:08:31 crc kubenswrapper[4869]: I0218 06:08:31.767508 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 06:08:31 crc kubenswrapper[4869]: I0218 06:08:31.879952 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxhd6\" (UniqueName: \"kubernetes.io/projected/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-kube-api-access-hxhd6\") pod \"79bdd2ba-f727-450b-a1ee-08dd5c68e84f\" (UID: \"79bdd2ba-f727-450b-a1ee-08dd5c68e84f\") " Feb 18 06:08:31 crc kubenswrapper[4869]: I0218 06:08:31.880175 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-logs\") pod \"79bdd2ba-f727-450b-a1ee-08dd5c68e84f\" (UID: \"79bdd2ba-f727-450b-a1ee-08dd5c68e84f\") " Feb 18 06:08:31 crc kubenswrapper[4869]: I0218 06:08:31.880259 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-config-data\") pod \"79bdd2ba-f727-450b-a1ee-08dd5c68e84f\" (UID: \"79bdd2ba-f727-450b-a1ee-08dd5c68e84f\") " Feb 18 06:08:31 crc kubenswrapper[4869]: I0218 06:08:31.880375 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-nova-metadata-tls-certs\") pod \"79bdd2ba-f727-450b-a1ee-08dd5c68e84f\" (UID: \"79bdd2ba-f727-450b-a1ee-08dd5c68e84f\") " Feb 18 06:08:31 crc kubenswrapper[4869]: I0218 06:08:31.880454 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-combined-ca-bundle\") pod \"79bdd2ba-f727-450b-a1ee-08dd5c68e84f\" (UID: \"79bdd2ba-f727-450b-a1ee-08dd5c68e84f\") " Feb 18 06:08:31 crc kubenswrapper[4869]: I0218 06:08:31.880712 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-logs" (OuterVolumeSpecName: "logs") pod "79bdd2ba-f727-450b-a1ee-08dd5c68e84f" (UID: "79bdd2ba-f727-450b-a1ee-08dd5c68e84f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:08:31 crc kubenswrapper[4869]: I0218 06:08:31.881269 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-logs\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:31 crc kubenswrapper[4869]: I0218 06:08:31.897893 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-kube-api-access-hxhd6" (OuterVolumeSpecName: "kube-api-access-hxhd6") pod "79bdd2ba-f727-450b-a1ee-08dd5c68e84f" (UID: "79bdd2ba-f727-450b-a1ee-08dd5c68e84f"). InnerVolumeSpecName "kube-api-access-hxhd6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:08:31 crc kubenswrapper[4869]: I0218 06:08:31.917229 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79bdd2ba-f727-450b-a1ee-08dd5c68e84f" (UID: "79bdd2ba-f727-450b-a1ee-08dd5c68e84f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:08:31 crc kubenswrapper[4869]: I0218 06:08:31.921117 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-config-data" (OuterVolumeSpecName: "config-data") pod "79bdd2ba-f727-450b-a1ee-08dd5c68e84f" (UID: "79bdd2ba-f727-450b-a1ee-08dd5c68e84f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:08:31 crc kubenswrapper[4869]: I0218 06:08:31.948296 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "79bdd2ba-f727-450b-a1ee-08dd5c68e84f" (UID: "79bdd2ba-f727-450b-a1ee-08dd5c68e84f"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:08:31 crc kubenswrapper[4869]: I0218 06:08:31.961847 4869 generic.go:334] "Generic (PLEG): container finished" podID="79bdd2ba-f727-450b-a1ee-08dd5c68e84f" containerID="065a24d0c8e4d94507fe2025b0fd4599260ee794c44483e2eefc1f88def23f68" exitCode=0 Feb 18 06:08:31 crc kubenswrapper[4869]: I0218 06:08:31.961891 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79bdd2ba-f727-450b-a1ee-08dd5c68e84f","Type":"ContainerDied","Data":"065a24d0c8e4d94507fe2025b0fd4599260ee794c44483e2eefc1f88def23f68"} Feb 18 06:08:31 crc kubenswrapper[4869]: I0218 06:08:31.961922 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79bdd2ba-f727-450b-a1ee-08dd5c68e84f","Type":"ContainerDied","Data":"a525d04f1dea3b855d455276e4ffaa21fc705e0e900dd1bf5aa95ff2a5b2ab37"} Feb 18 06:08:31 crc kubenswrapper[4869]: I0218 06:08:31.961940 4869 scope.go:117] "RemoveContainer" containerID="065a24d0c8e4d94507fe2025b0fd4599260ee794c44483e2eefc1f88def23f68" Feb 18 06:08:31 crc kubenswrapper[4869]: I0218 06:08:31.962063 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 06:08:31 crc kubenswrapper[4869]: I0218 06:08:31.982650 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:31 crc kubenswrapper[4869]: I0218 06:08:31.982685 4869 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:31 crc kubenswrapper[4869]: I0218 06:08:31.982696 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:31 crc kubenswrapper[4869]: I0218 06:08:31.982706 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxhd6\" (UniqueName: \"kubernetes.io/projected/79bdd2ba-f727-450b-a1ee-08dd5c68e84f-kube-api-access-hxhd6\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.002330 4869 scope.go:117] "RemoveContainer" containerID="73c6e2ec4928699d25deb2f1c78fa2c65bf8b497fbeff5f8e21135d4e8f1c284" Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.002914 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.012042 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.027186 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:08:32 crc kubenswrapper[4869]: E0218 06:08:32.027632 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbaac7cd-b5f2-40f5-8482-c29eafa37f95" containerName="init" Feb 18 06:08:32 crc 
kubenswrapper[4869]: I0218 06:08:32.027655 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbaac7cd-b5f2-40f5-8482-c29eafa37f95" containerName="init" Feb 18 06:08:32 crc kubenswrapper[4869]: E0218 06:08:32.027674 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9913d3a8-cea5-410a-a39e-270de53de317" containerName="nova-manage" Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.027682 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="9913d3a8-cea5-410a-a39e-270de53de317" containerName="nova-manage" Feb 18 06:08:32 crc kubenswrapper[4869]: E0218 06:08:32.027711 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79bdd2ba-f727-450b-a1ee-08dd5c68e84f" containerName="nova-metadata-metadata" Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.027719 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="79bdd2ba-f727-450b-a1ee-08dd5c68e84f" containerName="nova-metadata-metadata" Feb 18 06:08:32 crc kubenswrapper[4869]: E0218 06:08:32.027732 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbaac7cd-b5f2-40f5-8482-c29eafa37f95" containerName="dnsmasq-dns" Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.027770 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbaac7cd-b5f2-40f5-8482-c29eafa37f95" containerName="dnsmasq-dns" Feb 18 06:08:32 crc kubenswrapper[4869]: E0218 06:08:32.027863 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79bdd2ba-f727-450b-a1ee-08dd5c68e84f" containerName="nova-metadata-log" Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.027873 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="79bdd2ba-f727-450b-a1ee-08dd5c68e84f" containerName="nova-metadata-log" Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.028353 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="79bdd2ba-f727-450b-a1ee-08dd5c68e84f" containerName="nova-metadata-metadata" Feb 18 06:08:32 crc 
kubenswrapper[4869]: I0218 06:08:32.028417 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbaac7cd-b5f2-40f5-8482-c29eafa37f95" containerName="dnsmasq-dns"
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.028435 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="79bdd2ba-f727-450b-a1ee-08dd5c68e84f" containerName="nova-metadata-log"
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.028444 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="9913d3a8-cea5-410a-a39e-270de53de317" containerName="nova-manage"
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.032985 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.036460 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.036607 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.040851 4869 scope.go:117] "RemoveContainer" containerID="065a24d0c8e4d94507fe2025b0fd4599260ee794c44483e2eefc1f88def23f68"
Feb 18 06:08:32 crc kubenswrapper[4869]: E0218 06:08:32.041377 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"065a24d0c8e4d94507fe2025b0fd4599260ee794c44483e2eefc1f88def23f68\": container with ID starting with 065a24d0c8e4d94507fe2025b0fd4599260ee794c44483e2eefc1f88def23f68 not found: ID does not exist" containerID="065a24d0c8e4d94507fe2025b0fd4599260ee794c44483e2eefc1f88def23f68"
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.041415 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"065a24d0c8e4d94507fe2025b0fd4599260ee794c44483e2eefc1f88def23f68"} err="failed to get container status \"065a24d0c8e4d94507fe2025b0fd4599260ee794c44483e2eefc1f88def23f68\": rpc error: code = NotFound desc = could not find container \"065a24d0c8e4d94507fe2025b0fd4599260ee794c44483e2eefc1f88def23f68\": container with ID starting with 065a24d0c8e4d94507fe2025b0fd4599260ee794c44483e2eefc1f88def23f68 not found: ID does not exist"
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.041444 4869 scope.go:117] "RemoveContainer" containerID="73c6e2ec4928699d25deb2f1c78fa2c65bf8b497fbeff5f8e21135d4e8f1c284"
Feb 18 06:08:32 crc kubenswrapper[4869]: E0218 06:08:32.041672 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73c6e2ec4928699d25deb2f1c78fa2c65bf8b497fbeff5f8e21135d4e8f1c284\": container with ID starting with 73c6e2ec4928699d25deb2f1c78fa2c65bf8b497fbeff5f8e21135d4e8f1c284 not found: ID does not exist" containerID="73c6e2ec4928699d25deb2f1c78fa2c65bf8b497fbeff5f8e21135d4e8f1c284"
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.041701 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73c6e2ec4928699d25deb2f1c78fa2c65bf8b497fbeff5f8e21135d4e8f1c284"} err="failed to get container status \"73c6e2ec4928699d25deb2f1c78fa2c65bf8b497fbeff5f8e21135d4e8f1c284\": rpc error: code = NotFound desc = could not find container \"73c6e2ec4928699d25deb2f1c78fa2c65bf8b497fbeff5f8e21135d4e8f1c284\": container with ID starting with 73c6e2ec4928699d25deb2f1c78fa2c65bf8b497fbeff5f8e21135d4e8f1c284 not found: ID does not exist"
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.053987 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.187429 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f-config-data\") pod \"nova-metadata-0\" (UID: \"eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f\") " pod="openstack/nova-metadata-0"
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.187684 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f\") " pod="openstack/nova-metadata-0"
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.187733 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f\") " pod="openstack/nova-metadata-0"
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.187853 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f-logs\") pod \"nova-metadata-0\" (UID: \"eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f\") " pod="openstack/nova-metadata-0"
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.187894 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7bkb\" (UniqueName: \"kubernetes.io/projected/eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f-kube-api-access-f7bkb\") pod \"nova-metadata-0\" (UID: \"eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f\") " pod="openstack/nova-metadata-0"
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.290282 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f-config-data\") pod \"nova-metadata-0\" (UID: \"eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f\") " pod="openstack/nova-metadata-0"
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.290354 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f\") " pod="openstack/nova-metadata-0"
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.290399 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f\") " pod="openstack/nova-metadata-0"
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.290491 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f-logs\") pod \"nova-metadata-0\" (UID: \"eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f\") " pod="openstack/nova-metadata-0"
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.290534 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7bkb\" (UniqueName: \"kubernetes.io/projected/eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f-kube-api-access-f7bkb\") pod \"nova-metadata-0\" (UID: \"eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f\") " pod="openstack/nova-metadata-0"
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.292053 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f-logs\") pod \"nova-metadata-0\" (UID: \"eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f\") " pod="openstack/nova-metadata-0"
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.294798 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f-config-data\") pod \"nova-metadata-0\" (UID: \"eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f\") " pod="openstack/nova-metadata-0"
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.294798 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f\") " pod="openstack/nova-metadata-0"
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.296203 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f\") " pod="openstack/nova-metadata-0"
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.310118 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7bkb\" (UniqueName: \"kubernetes.io/projected/eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f-kube-api-access-f7bkb\") pod \"nova-metadata-0\" (UID: \"eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f\") " pod="openstack/nova-metadata-0"
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.353944 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.911945 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 06:08:32 crc kubenswrapper[4869]: I0218 06:08:32.977012 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f","Type":"ContainerStarted","Data":"82cac78fdcd46959e0bc23a700d2b02a5c37e125546645f5da333eebcfcf5605"}
Feb 18 06:08:33 crc kubenswrapper[4869]: I0218 06:08:33.490539 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79bdd2ba-f727-450b-a1ee-08dd5c68e84f" path="/var/lib/kubelet/pods/79bdd2ba-f727-450b-a1ee-08dd5c68e84f/volumes"
Feb 18 06:08:33 crc kubenswrapper[4869]: I0218 06:08:33.684484 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 18 06:08:33 crc kubenswrapper[4869]: I0218 06:08:33.748543 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/066c0252-91a6-43e7-9164-3576a49ec5f8-config-data\") pod \"066c0252-91a6-43e7-9164-3576a49ec5f8\" (UID: \"066c0252-91a6-43e7-9164-3576a49ec5f8\") "
Feb 18 06:08:33 crc kubenswrapper[4869]: I0218 06:08:33.748598 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/066c0252-91a6-43e7-9164-3576a49ec5f8-combined-ca-bundle\") pod \"066c0252-91a6-43e7-9164-3576a49ec5f8\" (UID: \"066c0252-91a6-43e7-9164-3576a49ec5f8\") "
Feb 18 06:08:33 crc kubenswrapper[4869]: I0218 06:08:33.748724 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xhrd\" (UniqueName: \"kubernetes.io/projected/066c0252-91a6-43e7-9164-3576a49ec5f8-kube-api-access-7xhrd\") pod \"066c0252-91a6-43e7-9164-3576a49ec5f8\" (UID: \"066c0252-91a6-43e7-9164-3576a49ec5f8\") "
Feb 18 06:08:33 crc kubenswrapper[4869]: I0218 06:08:33.756626 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/066c0252-91a6-43e7-9164-3576a49ec5f8-kube-api-access-7xhrd" (OuterVolumeSpecName: "kube-api-access-7xhrd") pod "066c0252-91a6-43e7-9164-3576a49ec5f8" (UID: "066c0252-91a6-43e7-9164-3576a49ec5f8"). InnerVolumeSpecName "kube-api-access-7xhrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:08:33 crc kubenswrapper[4869]: I0218 06:08:33.781887 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/066c0252-91a6-43e7-9164-3576a49ec5f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "066c0252-91a6-43e7-9164-3576a49ec5f8" (UID: "066c0252-91a6-43e7-9164-3576a49ec5f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:33 crc kubenswrapper[4869]: I0218 06:08:33.789499 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/066c0252-91a6-43e7-9164-3576a49ec5f8-config-data" (OuterVolumeSpecName: "config-data") pod "066c0252-91a6-43e7-9164-3576a49ec5f8" (UID: "066c0252-91a6-43e7-9164-3576a49ec5f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:33 crc kubenswrapper[4869]: I0218 06:08:33.851020 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/066c0252-91a6-43e7-9164-3576a49ec5f8-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:33 crc kubenswrapper[4869]: I0218 06:08:33.851051 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/066c0252-91a6-43e7-9164-3576a49ec5f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:33 crc kubenswrapper[4869]: I0218 06:08:33.851062 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xhrd\" (UniqueName: \"kubernetes.io/projected/066c0252-91a6-43e7-9164-3576a49ec5f8-kube-api-access-7xhrd\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:33 crc kubenswrapper[4869]: I0218 06:08:33.994656 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f","Type":"ContainerStarted","Data":"2d55c2544effa6d959232f9950d5eda4766a446996fdb668fbeccf32fc6ae8f7"}
Feb 18 06:08:33 crc kubenswrapper[4869]: I0218 06:08:33.994702 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f","Type":"ContainerStarted","Data":"9878e7ab67793e4c90347a69c22ee23fb0340d29fde2734c3f2ee0251097e7d3"}
Feb 18 06:08:33 crc kubenswrapper[4869]: I0218 06:08:33.997451 4869 generic.go:334] "Generic (PLEG): container finished" podID="066c0252-91a6-43e7-9164-3576a49ec5f8" containerID="88906d1972ff554fef8ad7ed81d3dbd4423d493df8fb5560849abc028bfdea69" exitCode=0
Feb 18 06:08:33 crc kubenswrapper[4869]: I0218 06:08:33.997472 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 18 06:08:33 crc kubenswrapper[4869]: I0218 06:08:33.997539 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"066c0252-91a6-43e7-9164-3576a49ec5f8","Type":"ContainerDied","Data":"88906d1972ff554fef8ad7ed81d3dbd4423d493df8fb5560849abc028bfdea69"}
Feb 18 06:08:33 crc kubenswrapper[4869]: I0218 06:08:33.997586 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"066c0252-91a6-43e7-9164-3576a49ec5f8","Type":"ContainerDied","Data":"8edbf11ef5c8e9542ddcf5e14cd07647ac25a91201b049f0c5ec444b2d683fce"}
Feb 18 06:08:33 crc kubenswrapper[4869]: I0218 06:08:33.997607 4869 scope.go:117] "RemoveContainer" containerID="88906d1972ff554fef8ad7ed81d3dbd4423d493df8fb5560849abc028bfdea69"
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.028171 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.028152099 podStartE2EDuration="2.028152099s" podCreationTimestamp="2026-02-18 06:08:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:08:34.016271369 +0000 UTC m=+1211.185359601" watchObservedRunningTime="2026-02-18 06:08:34.028152099 +0000 UTC m=+1211.197240331"
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.052624 4869 scope.go:117] "RemoveContainer" containerID="88906d1972ff554fef8ad7ed81d3dbd4423d493df8fb5560849abc028bfdea69"
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.053379 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 18 06:08:34 crc kubenswrapper[4869]: E0218 06:08:34.054111 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88906d1972ff554fef8ad7ed81d3dbd4423d493df8fb5560849abc028bfdea69\": container with ID starting with 88906d1972ff554fef8ad7ed81d3dbd4423d493df8fb5560849abc028bfdea69 not found: ID does not exist" containerID="88906d1972ff554fef8ad7ed81d3dbd4423d493df8fb5560849abc028bfdea69"
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.054150 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88906d1972ff554fef8ad7ed81d3dbd4423d493df8fb5560849abc028bfdea69"} err="failed to get container status \"88906d1972ff554fef8ad7ed81d3dbd4423d493df8fb5560849abc028bfdea69\": rpc error: code = NotFound desc = could not find container \"88906d1972ff554fef8ad7ed81d3dbd4423d493df8fb5560849abc028bfdea69\": container with ID starting with 88906d1972ff554fef8ad7ed81d3dbd4423d493df8fb5560849abc028bfdea69 not found: ID does not exist"
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.062932 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.070948 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 18 06:08:34 crc kubenswrapper[4869]: E0218 06:08:34.071329 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="066c0252-91a6-43e7-9164-3576a49ec5f8" containerName="nova-scheduler-scheduler"
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.071344 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="066c0252-91a6-43e7-9164-3576a49ec5f8" containerName="nova-scheduler-scheduler"
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.071525 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="066c0252-91a6-43e7-9164-3576a49ec5f8" containerName="nova-scheduler-scheduler"
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.072148 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.078218 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.090722 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.155586 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f31fe572-a7a1-48f4-8ebb-e621788c2456-config-data\") pod \"nova-scheduler-0\" (UID: \"f31fe572-a7a1-48f4-8ebb-e621788c2456\") " pod="openstack/nova-scheduler-0"
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.155770 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f31fe572-a7a1-48f4-8ebb-e621788c2456-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f31fe572-a7a1-48f4-8ebb-e621788c2456\") " pod="openstack/nova-scheduler-0"
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.155811 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfttd\" (UniqueName: \"kubernetes.io/projected/f31fe572-a7a1-48f4-8ebb-e621788c2456-kube-api-access-wfttd\") pod \"nova-scheduler-0\" (UID: \"f31fe572-a7a1-48f4-8ebb-e621788c2456\") " pod="openstack/nova-scheduler-0"
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.256937 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f31fe572-a7a1-48f4-8ebb-e621788c2456-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f31fe572-a7a1-48f4-8ebb-e621788c2456\") " pod="openstack/nova-scheduler-0"
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.257244 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfttd\" (UniqueName: \"kubernetes.io/projected/f31fe572-a7a1-48f4-8ebb-e621788c2456-kube-api-access-wfttd\") pod \"nova-scheduler-0\" (UID: \"f31fe572-a7a1-48f4-8ebb-e621788c2456\") " pod="openstack/nova-scheduler-0"
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.257410 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f31fe572-a7a1-48f4-8ebb-e621788c2456-config-data\") pod \"nova-scheduler-0\" (UID: \"f31fe572-a7a1-48f4-8ebb-e621788c2456\") " pod="openstack/nova-scheduler-0"
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.262369 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f31fe572-a7a1-48f4-8ebb-e621788c2456-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f31fe572-a7a1-48f4-8ebb-e621788c2456\") " pod="openstack/nova-scheduler-0"
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.270336 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f31fe572-a7a1-48f4-8ebb-e621788c2456-config-data\") pod \"nova-scheduler-0\" (UID: \"f31fe572-a7a1-48f4-8ebb-e621788c2456\") " pod="openstack/nova-scheduler-0"
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.281614 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfttd\" (UniqueName: \"kubernetes.io/projected/f31fe572-a7a1-48f4-8ebb-e621788c2456-kube-api-access-wfttd\") pod \"nova-scheduler-0\" (UID: \"f31fe572-a7a1-48f4-8ebb-e621788c2456\") " pod="openstack/nova-scheduler-0"
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.398215 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.891164 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 18 06:08:34 crc kubenswrapper[4869]: W0218 06:08:34.897965 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf31fe572_a7a1_48f4_8ebb_e621788c2456.slice/crio-976dda0e8ed769996150d330faa91ff4e1ba74c0b1d58aa9a0601295ca70ddad WatchSource:0}: Error finding container 976dda0e8ed769996150d330faa91ff4e1ba74c0b1d58aa9a0601295ca70ddad: Status 404 returned error can't find the container with id 976dda0e8ed769996150d330faa91ff4e1ba74c0b1d58aa9a0601295ca70ddad
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.958998 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.970999 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df29ddad-6564-4dfb-9b24-e344ac6535f3-combined-ca-bundle\") pod \"df29ddad-6564-4dfb-9b24-e344ac6535f3\" (UID: \"df29ddad-6564-4dfb-9b24-e344ac6535f3\") "
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.971073 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df29ddad-6564-4dfb-9b24-e344ac6535f3-internal-tls-certs\") pod \"df29ddad-6564-4dfb-9b24-e344ac6535f3\" (UID: \"df29ddad-6564-4dfb-9b24-e344ac6535f3\") "
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.971091 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df29ddad-6564-4dfb-9b24-e344ac6535f3-public-tls-certs\") pod \"df29ddad-6564-4dfb-9b24-e344ac6535f3\" (UID: \"df29ddad-6564-4dfb-9b24-e344ac6535f3\") "
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.971113 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df29ddad-6564-4dfb-9b24-e344ac6535f3-logs\") pod \"df29ddad-6564-4dfb-9b24-e344ac6535f3\" (UID: \"df29ddad-6564-4dfb-9b24-e344ac6535f3\") "
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.971149 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df29ddad-6564-4dfb-9b24-e344ac6535f3-config-data\") pod \"df29ddad-6564-4dfb-9b24-e344ac6535f3\" (UID: \"df29ddad-6564-4dfb-9b24-e344ac6535f3\") "
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.971226 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnd94\" (UniqueName: \"kubernetes.io/projected/df29ddad-6564-4dfb-9b24-e344ac6535f3-kube-api-access-xnd94\") pod \"df29ddad-6564-4dfb-9b24-e344ac6535f3\" (UID: \"df29ddad-6564-4dfb-9b24-e344ac6535f3\") "
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.971968 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df29ddad-6564-4dfb-9b24-e344ac6535f3-logs" (OuterVolumeSpecName: "logs") pod "df29ddad-6564-4dfb-9b24-e344ac6535f3" (UID: "df29ddad-6564-4dfb-9b24-e344ac6535f3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:08:34 crc kubenswrapper[4869]: I0218 06:08:34.977006 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df29ddad-6564-4dfb-9b24-e344ac6535f3-kube-api-access-xnd94" (OuterVolumeSpecName: "kube-api-access-xnd94") pod "df29ddad-6564-4dfb-9b24-e344ac6535f3" (UID: "df29ddad-6564-4dfb-9b24-e344ac6535f3"). InnerVolumeSpecName "kube-api-access-xnd94". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.018199 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f31fe572-a7a1-48f4-8ebb-e621788c2456","Type":"ContainerStarted","Data":"976dda0e8ed769996150d330faa91ff4e1ba74c0b1d58aa9a0601295ca70ddad"}
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.020722 4869 generic.go:334] "Generic (PLEG): container finished" podID="df29ddad-6564-4dfb-9b24-e344ac6535f3" containerID="75e748a23c8b8ca4e811ebf712198d6e5c1c1e36f1191dbdd747390340b00543" exitCode=0
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.020960 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"df29ddad-6564-4dfb-9b24-e344ac6535f3","Type":"ContainerDied","Data":"75e748a23c8b8ca4e811ebf712198d6e5c1c1e36f1191dbdd747390340b00543"}
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.021003 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"df29ddad-6564-4dfb-9b24-e344ac6535f3","Type":"ContainerDied","Data":"f71534b6d1deef0cdf174f2c7537837964f00e5819320ba29fea1156eaaba382"}
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.021026 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.021026 4869 scope.go:117] "RemoveContainer" containerID="75e748a23c8b8ca4e811ebf712198d6e5c1c1e36f1191dbdd747390340b00543"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.033056 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df29ddad-6564-4dfb-9b24-e344ac6535f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df29ddad-6564-4dfb-9b24-e344ac6535f3" (UID: "df29ddad-6564-4dfb-9b24-e344ac6535f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.043695 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df29ddad-6564-4dfb-9b24-e344ac6535f3-config-data" (OuterVolumeSpecName: "config-data") pod "df29ddad-6564-4dfb-9b24-e344ac6535f3" (UID: "df29ddad-6564-4dfb-9b24-e344ac6535f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.046162 4869 scope.go:117] "RemoveContainer" containerID="6f30be7c865b134b199df7710ebb4f143c4e1ce07c44360ad7293603551b71ab"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.064762 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df29ddad-6564-4dfb-9b24-e344ac6535f3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "df29ddad-6564-4dfb-9b24-e344ac6535f3" (UID: "df29ddad-6564-4dfb-9b24-e344ac6535f3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.067879 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df29ddad-6564-4dfb-9b24-e344ac6535f3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "df29ddad-6564-4dfb-9b24-e344ac6535f3" (UID: "df29ddad-6564-4dfb-9b24-e344ac6535f3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.074295 4869 scope.go:117] "RemoveContainer" containerID="75e748a23c8b8ca4e811ebf712198d6e5c1c1e36f1191dbdd747390340b00543"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.074306 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df29ddad-6564-4dfb-9b24-e344ac6535f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.074384 4869 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df29ddad-6564-4dfb-9b24-e344ac6535f3-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.074396 4869 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df29ddad-6564-4dfb-9b24-e344ac6535f3-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.074407 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df29ddad-6564-4dfb-9b24-e344ac6535f3-logs\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.074419 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df29ddad-6564-4dfb-9b24-e344ac6535f3-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.074431 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnd94\" (UniqueName: \"kubernetes.io/projected/df29ddad-6564-4dfb-9b24-e344ac6535f3-kube-api-access-xnd94\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:35 crc kubenswrapper[4869]: E0218 06:08:35.075270 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75e748a23c8b8ca4e811ebf712198d6e5c1c1e36f1191dbdd747390340b00543\": container with ID starting with 75e748a23c8b8ca4e811ebf712198d6e5c1c1e36f1191dbdd747390340b00543 not found: ID does not exist" containerID="75e748a23c8b8ca4e811ebf712198d6e5c1c1e36f1191dbdd747390340b00543"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.075301 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75e748a23c8b8ca4e811ebf712198d6e5c1c1e36f1191dbdd747390340b00543"} err="failed to get container status \"75e748a23c8b8ca4e811ebf712198d6e5c1c1e36f1191dbdd747390340b00543\": rpc error: code = NotFound desc = could not find container \"75e748a23c8b8ca4e811ebf712198d6e5c1c1e36f1191dbdd747390340b00543\": container with ID starting with 75e748a23c8b8ca4e811ebf712198d6e5c1c1e36f1191dbdd747390340b00543 not found: ID does not exist"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.075334 4869 scope.go:117] "RemoveContainer" containerID="6f30be7c865b134b199df7710ebb4f143c4e1ce07c44360ad7293603551b71ab"
Feb 18 06:08:35 crc kubenswrapper[4869]: E0218 06:08:35.075624 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f30be7c865b134b199df7710ebb4f143c4e1ce07c44360ad7293603551b71ab\": container with ID starting with 6f30be7c865b134b199df7710ebb4f143c4e1ce07c44360ad7293603551b71ab not found: ID does not exist" containerID="6f30be7c865b134b199df7710ebb4f143c4e1ce07c44360ad7293603551b71ab"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.075651 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f30be7c865b134b199df7710ebb4f143c4e1ce07c44360ad7293603551b71ab"} err="failed to get container status \"6f30be7c865b134b199df7710ebb4f143c4e1ce07c44360ad7293603551b71ab\": rpc error: code = NotFound desc = could not find container \"6f30be7c865b134b199df7710ebb4f143c4e1ce07c44360ad7293603551b71ab\": container with ID starting with 6f30be7c865b134b199df7710ebb4f143c4e1ce07c44360ad7293603551b71ab not found: ID does not exist"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.369898 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.389879 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.399178 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 18 06:08:35 crc kubenswrapper[4869]: E0218 06:08:35.399944 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df29ddad-6564-4dfb-9b24-e344ac6535f3" containerName="nova-api-api"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.400051 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="df29ddad-6564-4dfb-9b24-e344ac6535f3" containerName="nova-api-api"
Feb 18 06:08:35 crc kubenswrapper[4869]: E0218 06:08:35.400113 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df29ddad-6564-4dfb-9b24-e344ac6535f3" containerName="nova-api-log"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.400174 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="df29ddad-6564-4dfb-9b24-e344ac6535f3" containerName="nova-api-log"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.400452 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="df29ddad-6564-4dfb-9b24-e344ac6535f3" containerName="nova-api-log"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.400524 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="df29ddad-6564-4dfb-9b24-e344ac6535f3" containerName="nova-api-api"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.401914 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.407434 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.407832 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.408605 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.413800 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.480458 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="066c0252-91a6-43e7-9164-3576a49ec5f8" path="/var/lib/kubelet/pods/066c0252-91a6-43e7-9164-3576a49ec5f8/volumes"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.481169 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df29ddad-6564-4dfb-9b24-e344ac6535f3" path="/var/lib/kubelet/pods/df29ddad-6564-4dfb-9b24-e344ac6535f3/volumes"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.482099 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b848f50-bd92-4c86-8f5e-64c4fd3e2521-logs\") pod \"nova-api-0\" (UID: \"1b848f50-bd92-4c86-8f5e-64c4fd3e2521\") " pod="openstack/nova-api-0"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.482167 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nhfx\" (UniqueName: \"kubernetes.io/projected/1b848f50-bd92-4c86-8f5e-64c4fd3e2521-kube-api-access-4nhfx\") pod \"nova-api-0\" (UID: \"1b848f50-bd92-4c86-8f5e-64c4fd3e2521\") " pod="openstack/nova-api-0"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.482205 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b848f50-bd92-4c86-8f5e-64c4fd3e2521-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1b848f50-bd92-4c86-8f5e-64c4fd3e2521\") " pod="openstack/nova-api-0"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.482224 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b848f50-bd92-4c86-8f5e-64c4fd3e2521-public-tls-certs\") pod \"nova-api-0\" (UID: \"1b848f50-bd92-4c86-8f5e-64c4fd3e2521\") " pod="openstack/nova-api-0"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.482285 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b848f50-bd92-4c86-8f5e-64c4fd3e2521-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1b848f50-bd92-4c86-8f5e-64c4fd3e2521\") " pod="openstack/nova-api-0"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.482304 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b848f50-bd92-4c86-8f5e-64c4fd3e2521-config-data\") pod \"nova-api-0\" (UID: \"1b848f50-bd92-4c86-8f5e-64c4fd3e2521\") " pod="openstack/nova-api-0"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.585575 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b848f50-bd92-4c86-8f5e-64c4fd3e2521-logs\") pod \"nova-api-0\" (UID: \"1b848f50-bd92-4c86-8f5e-64c4fd3e2521\") " pod="openstack/nova-api-0"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.585944 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nhfx\" (UniqueName: \"kubernetes.io/projected/1b848f50-bd92-4c86-8f5e-64c4fd3e2521-kube-api-access-4nhfx\") pod \"nova-api-0\" (UID: \"1b848f50-bd92-4c86-8f5e-64c4fd3e2521\") " pod="openstack/nova-api-0"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.586059 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b848f50-bd92-4c86-8f5e-64c4fd3e2521-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1b848f50-bd92-4c86-8f5e-64c4fd3e2521\") " pod="openstack/nova-api-0"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.586187 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b848f50-bd92-4c86-8f5e-64c4fd3e2521-logs\") pod \"nova-api-0\" (UID: \"1b848f50-bd92-4c86-8f5e-64c4fd3e2521\") " pod="openstack/nova-api-0"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.586298 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b848f50-bd92-4c86-8f5e-64c4fd3e2521-public-tls-certs\") pod \"nova-api-0\" (UID: \"1b848f50-bd92-4c86-8f5e-64c4fd3e2521\") " pod="openstack/nova-api-0"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.587326 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b848f50-bd92-4c86-8f5e-64c4fd3e2521-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1b848f50-bd92-4c86-8f5e-64c4fd3e2521\") " pod="openstack/nova-api-0"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.587366 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b848f50-bd92-4c86-8f5e-64c4fd3e2521-config-data\") pod \"nova-api-0\" (UID: \"1b848f50-bd92-4c86-8f5e-64c4fd3e2521\") " pod="openstack/nova-api-0"
Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.590636 4869 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b848f50-bd92-4c86-8f5e-64c4fd3e2521-public-tls-certs\") pod \"nova-api-0\" (UID: \"1b848f50-bd92-4c86-8f5e-64c4fd3e2521\") " pod="openstack/nova-api-0" Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.591433 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b848f50-bd92-4c86-8f5e-64c4fd3e2521-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1b848f50-bd92-4c86-8f5e-64c4fd3e2521\") " pod="openstack/nova-api-0" Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.591653 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b848f50-bd92-4c86-8f5e-64c4fd3e2521-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1b848f50-bd92-4c86-8f5e-64c4fd3e2521\") " pod="openstack/nova-api-0" Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.592847 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b848f50-bd92-4c86-8f5e-64c4fd3e2521-config-data\") pod \"nova-api-0\" (UID: \"1b848f50-bd92-4c86-8f5e-64c4fd3e2521\") " pod="openstack/nova-api-0" Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.617497 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nhfx\" (UniqueName: \"kubernetes.io/projected/1b848f50-bd92-4c86-8f5e-64c4fd3e2521-kube-api-access-4nhfx\") pod \"nova-api-0\" (UID: \"1b848f50-bd92-4c86-8f5e-64c4fd3e2521\") " pod="openstack/nova-api-0" Feb 18 06:08:35 crc kubenswrapper[4869]: I0218 06:08:35.738266 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 18 06:08:36 crc kubenswrapper[4869]: I0218 06:08:36.034895 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f31fe572-a7a1-48f4-8ebb-e621788c2456","Type":"ContainerStarted","Data":"07274f7a4ba8d98ac4e857d7d1651644d3e53efed1f9a40cdee46adfb3af28d9"} Feb 18 06:08:36 crc kubenswrapper[4869]: I0218 06:08:36.066944 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.066907089 podStartE2EDuration="2.066907089s" podCreationTimestamp="2026-02-18 06:08:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:08:36.064468008 +0000 UTC m=+1213.233556240" watchObservedRunningTime="2026-02-18 06:08:36.066907089 +0000 UTC m=+1213.235995351" Feb 18 06:08:36 crc kubenswrapper[4869]: W0218 06:08:36.265668 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b848f50_bd92_4c86_8f5e_64c4fd3e2521.slice/crio-1d5a75ec653f34578d7136e9a249b4213b54191c1a3f4b9ca43aeb7bab9b083a WatchSource:0}: Error finding container 1d5a75ec653f34578d7136e9a249b4213b54191c1a3f4b9ca43aeb7bab9b083a: Status 404 returned error can't find the container with id 1d5a75ec653f34578d7136e9a249b4213b54191c1a3f4b9ca43aeb7bab9b083a Feb 18 06:08:36 crc kubenswrapper[4869]: I0218 06:08:36.267177 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 18 06:08:37 crc kubenswrapper[4869]: I0218 06:08:37.072445 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1b848f50-bd92-4c86-8f5e-64c4fd3e2521","Type":"ContainerStarted","Data":"55921ab6237e1d04fe4a0194755cb2d59464043e75b59a034449bee692dc6fd7"} Feb 18 06:08:37 crc kubenswrapper[4869]: I0218 06:08:37.072794 4869 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-api-0" event={"ID":"1b848f50-bd92-4c86-8f5e-64c4fd3e2521","Type":"ContainerStarted","Data":"8eac3bca782be4229889ed9b13e1763f23cd178f854f3e46ac5af3bbd56aa2fe"} Feb 18 06:08:37 crc kubenswrapper[4869]: I0218 06:08:37.072807 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1b848f50-bd92-4c86-8f5e-64c4fd3e2521","Type":"ContainerStarted","Data":"1d5a75ec653f34578d7136e9a249b4213b54191c1a3f4b9ca43aeb7bab9b083a"} Feb 18 06:08:37 crc kubenswrapper[4869]: I0218 06:08:37.144778 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.144733104 podStartE2EDuration="2.144733104s" podCreationTimestamp="2026-02-18 06:08:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:08:37.138971727 +0000 UTC m=+1214.308060009" watchObservedRunningTime="2026-02-18 06:08:37.144733104 +0000 UTC m=+1214.313821356" Feb 18 06:08:37 crc kubenswrapper[4869]: I0218 06:08:37.354144 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 18 06:08:37 crc kubenswrapper[4869]: I0218 06:08:37.354247 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 18 06:08:39 crc kubenswrapper[4869]: I0218 06:08:39.398879 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 18 06:08:42 crc kubenswrapper[4869]: I0218 06:08:42.354722 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 18 06:08:42 crc kubenswrapper[4869]: I0218 06:08:42.355147 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 18 06:08:43 crc kubenswrapper[4869]: I0218 06:08:43.365919 4869 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-metadata-0" podUID="eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 18 06:08:43 crc kubenswrapper[4869]: I0218 06:08:43.365918 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 18 06:08:44 crc kubenswrapper[4869]: I0218 06:08:44.399482 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 18 06:08:44 crc kubenswrapper[4869]: I0218 06:08:44.461470 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 18 06:08:45 crc kubenswrapper[4869]: I0218 06:08:45.201704 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 18 06:08:45 crc kubenswrapper[4869]: I0218 06:08:45.740061 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 18 06:08:45 crc kubenswrapper[4869]: I0218 06:08:45.740399 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 18 06:08:46 crc kubenswrapper[4869]: I0218 06:08:46.756095 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1b848f50-bd92-4c86-8f5e-64c4fd3e2521" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 18 06:08:46 crc kubenswrapper[4869]: I0218 06:08:46.756704 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="1b848f50-bd92-4c86-8f5e-64c4fd3e2521" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 18 06:08:49 crc kubenswrapper[4869]: I0218 06:08:49.080933 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 18 06:08:52 crc kubenswrapper[4869]: I0218 06:08:52.363665 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 18 06:08:52 crc kubenswrapper[4869]: I0218 06:08:52.364154 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 18 06:08:52 crc kubenswrapper[4869]: I0218 06:08:52.370214 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 18 06:08:52 crc kubenswrapper[4869]: I0218 06:08:52.370441 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 18 06:08:55 crc kubenswrapper[4869]: I0218 06:08:55.749307 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 18 06:08:55 crc kubenswrapper[4869]: I0218 06:08:55.750424 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 18 06:08:55 crc kubenswrapper[4869]: I0218 06:08:55.751457 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 18 06:08:55 crc kubenswrapper[4869]: I0218 06:08:55.759248 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 18 06:08:56 crc kubenswrapper[4869]: I0218 06:08:56.296841 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 18 06:08:56 crc kubenswrapper[4869]: I0218 06:08:56.308694 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-api-0" Feb 18 06:09:03 crc kubenswrapper[4869]: I0218 06:09:03.834457 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 06:09:04 crc kubenswrapper[4869]: I0218 06:09:04.665134 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 06:09:07 crc kubenswrapper[4869]: I0218 06:09:07.994703 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="15f90eb3-a8d8-489d-b8f6-41046e14e165" containerName="rabbitmq" containerID="cri-o://bd73329caf191e3e88645445f11a08e2209ad944a0492b42fb994fe2228de34e" gracePeriod=604796 Feb 18 06:09:09 crc kubenswrapper[4869]: I0218 06:09:09.008700 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90" containerName="rabbitmq" containerID="cri-o://9d2471433996dbecba9e05447defe62056ea851a4aa9534dca6057e96725ce44" gracePeriod=604796 Feb 18 06:09:13 crc kubenswrapper[4869]: I0218 06:09:13.362532 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="15f90eb3-a8d8-489d-b8f6-41046e14e165" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Feb 18 06:09:13 crc kubenswrapper[4869]: I0218 06:09:13.648976 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.467873 4869 generic.go:334] "Generic (PLEG): container finished" podID="15f90eb3-a8d8-489d-b8f6-41046e14e165" containerID="bd73329caf191e3e88645445f11a08e2209ad944a0492b42fb994fe2228de34e" exitCode=0 Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 
06:09:14.468136 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"15f90eb3-a8d8-489d-b8f6-41046e14e165","Type":"ContainerDied","Data":"bd73329caf191e3e88645445f11a08e2209ad944a0492b42fb994fe2228de34e"} Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.548217 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.612892 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/15f90eb3-a8d8-489d-b8f6-41046e14e165-rabbitmq-confd\") pod \"15f90eb3-a8d8-489d-b8f6-41046e14e165\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.612992 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/15f90eb3-a8d8-489d-b8f6-41046e14e165-rabbitmq-plugins\") pod \"15f90eb3-a8d8-489d-b8f6-41046e14e165\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.613110 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/15f90eb3-a8d8-489d-b8f6-41046e14e165-server-conf\") pod \"15f90eb3-a8d8-489d-b8f6-41046e14e165\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.613159 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"15f90eb3-a8d8-489d-b8f6-41046e14e165\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.613228 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/15f90eb3-a8d8-489d-b8f6-41046e14e165-rabbitmq-tls\") pod \"15f90eb3-a8d8-489d-b8f6-41046e14e165\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.613271 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jhxv\" (UniqueName: \"kubernetes.io/projected/15f90eb3-a8d8-489d-b8f6-41046e14e165-kube-api-access-9jhxv\") pod \"15f90eb3-a8d8-489d-b8f6-41046e14e165\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.613341 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/15f90eb3-a8d8-489d-b8f6-41046e14e165-config-data\") pod \"15f90eb3-a8d8-489d-b8f6-41046e14e165\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.613452 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/15f90eb3-a8d8-489d-b8f6-41046e14e165-rabbitmq-erlang-cookie\") pod \"15f90eb3-a8d8-489d-b8f6-41046e14e165\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.613505 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/15f90eb3-a8d8-489d-b8f6-41046e14e165-pod-info\") pod \"15f90eb3-a8d8-489d-b8f6-41046e14e165\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.613615 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/15f90eb3-a8d8-489d-b8f6-41046e14e165-erlang-cookie-secret\") pod \"15f90eb3-a8d8-489d-b8f6-41046e14e165\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " Feb 18 06:09:14 crc 
kubenswrapper[4869]: I0218 06:09:14.613638 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/15f90eb3-a8d8-489d-b8f6-41046e14e165-plugins-conf\") pod \"15f90eb3-a8d8-489d-b8f6-41046e14e165\" (UID: \"15f90eb3-a8d8-489d-b8f6-41046e14e165\") " Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.615257 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15f90eb3-a8d8-489d-b8f6-41046e14e165-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "15f90eb3-a8d8-489d-b8f6-41046e14e165" (UID: "15f90eb3-a8d8-489d-b8f6-41046e14e165"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.615575 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15f90eb3-a8d8-489d-b8f6-41046e14e165-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "15f90eb3-a8d8-489d-b8f6-41046e14e165" (UID: "15f90eb3-a8d8-489d-b8f6-41046e14e165"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.621414 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "15f90eb3-a8d8-489d-b8f6-41046e14e165" (UID: "15f90eb3-a8d8-489d-b8f6-41046e14e165"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.622038 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15f90eb3-a8d8-489d-b8f6-41046e14e165-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "15f90eb3-a8d8-489d-b8f6-41046e14e165" (UID: "15f90eb3-a8d8-489d-b8f6-41046e14e165"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.625547 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15f90eb3-a8d8-489d-b8f6-41046e14e165-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "15f90eb3-a8d8-489d-b8f6-41046e14e165" (UID: "15f90eb3-a8d8-489d-b8f6-41046e14e165"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.628970 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/15f90eb3-a8d8-489d-b8f6-41046e14e165-pod-info" (OuterVolumeSpecName: "pod-info") pod "15f90eb3-a8d8-489d-b8f6-41046e14e165" (UID: "15f90eb3-a8d8-489d-b8f6-41046e14e165"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.633188 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15f90eb3-a8d8-489d-b8f6-41046e14e165-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "15f90eb3-a8d8-489d-b8f6-41046e14e165" (UID: "15f90eb3-a8d8-489d-b8f6-41046e14e165"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.641952 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15f90eb3-a8d8-489d-b8f6-41046e14e165-kube-api-access-9jhxv" (OuterVolumeSpecName: "kube-api-access-9jhxv") pod "15f90eb3-a8d8-489d-b8f6-41046e14e165" (UID: "15f90eb3-a8d8-489d-b8f6-41046e14e165"). InnerVolumeSpecName "kube-api-access-9jhxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.718630 4869 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/15f90eb3-a8d8-489d-b8f6-41046e14e165-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.718670 4869 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/15f90eb3-a8d8-489d-b8f6-41046e14e165-pod-info\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.718679 4869 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/15f90eb3-a8d8-489d-b8f6-41046e14e165-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.718688 4869 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/15f90eb3-a8d8-489d-b8f6-41046e14e165-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.718696 4869 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/15f90eb3-a8d8-489d-b8f6-41046e14e165-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.718717 4869 reconciler_common.go:286] "operationExecutor.UnmountDevice 
started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.718725 4869 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/15f90eb3-a8d8-489d-b8f6-41046e14e165-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.718735 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jhxv\" (UniqueName: \"kubernetes.io/projected/15f90eb3-a8d8-489d-b8f6-41046e14e165-kube-api-access-9jhxv\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.731179 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15f90eb3-a8d8-489d-b8f6-41046e14e165-config-data" (OuterVolumeSpecName: "config-data") pod "15f90eb3-a8d8-489d-b8f6-41046e14e165" (UID: "15f90eb3-a8d8-489d-b8f6-41046e14e165"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.757680 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15f90eb3-a8d8-489d-b8f6-41046e14e165-server-conf" (OuterVolumeSpecName: "server-conf") pod "15f90eb3-a8d8-489d-b8f6-41046e14e165" (UID: "15f90eb3-a8d8-489d-b8f6-41046e14e165"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.781049 4869 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.820544 4869 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/15f90eb3-a8d8-489d-b8f6-41046e14e165-server-conf\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.820577 4869 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.820585 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/15f90eb3-a8d8-489d-b8f6-41046e14e165-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.838889 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15f90eb3-a8d8-489d-b8f6-41046e14e165-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "15f90eb3-a8d8-489d-b8f6-41046e14e165" (UID: "15f90eb3-a8d8-489d-b8f6-41046e14e165"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:09:14 crc kubenswrapper[4869]: I0218 06:09:14.921992 4869 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/15f90eb3-a8d8-489d-b8f6-41046e14e165-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.490941 4869 generic.go:334] "Generic (PLEG): container finished" podID="3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90" containerID="9d2471433996dbecba9e05447defe62056ea851a4aa9534dca6057e96725ce44" exitCode=0 Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.491277 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90","Type":"ContainerDied","Data":"9d2471433996dbecba9e05447defe62056ea851a4aa9534dca6057e96725ce44"} Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.496550 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"15f90eb3-a8d8-489d-b8f6-41046e14e165","Type":"ContainerDied","Data":"1cbbb86a3d8c4cbfb41f93e07d9da23bcfa04b2856c728a8b595c1d0730ce287"} Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.496602 4869 scope.go:117] "RemoveContainer" containerID="bd73329caf191e3e88645445f11a08e2209ad944a0492b42fb994fe2228de34e" Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.496687 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.531782 4869 scope.go:117] "RemoveContainer" containerID="00eeaaab546dfd1b221345a3f48c9875681142cf110a173f4094afe2a48c845c"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.566997 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.582826 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.607511 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 18 06:09:15 crc kubenswrapper[4869]: E0218 06:09:15.607996 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15f90eb3-a8d8-489d-b8f6-41046e14e165" containerName="setup-container"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.608015 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f90eb3-a8d8-489d-b8f6-41046e14e165" containerName="setup-container"
Feb 18 06:09:15 crc kubenswrapper[4869]: E0218 06:09:15.608051 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15f90eb3-a8d8-489d-b8f6-41046e14e165" containerName="rabbitmq"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.608058 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f90eb3-a8d8-489d-b8f6-41046e14e165" containerName="rabbitmq"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.608251 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="15f90eb3-a8d8-489d-b8f6-41046e14e165" containerName="rabbitmq"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.609368 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.616309 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.616523 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.616631 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.616844 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-dtzm6"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.616995 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.617098 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.617217 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.634903 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.694263 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.769050 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-rabbitmq-tls\") pod \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") "
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.769146 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-pod-info\") pod \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") "
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.769167 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-rabbitmq-plugins\") pod \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") "
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.769186 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") "
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.769216 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-erlang-cookie-secret\") pod \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") "
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.769241 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-server-conf\") pod \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") "
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.769258 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-rabbitmq-confd\") pod \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") "
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.769281 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-config-data\") pod \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") "
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.769308 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-plugins-conf\") pod \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") "
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.769375 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmxmj\" (UniqueName: \"kubernetes.io/projected/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-kube-api-access-qmxmj\") pod \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") "
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.769442 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-rabbitmq-erlang-cookie\") pod \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\" (UID: \"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90\") "
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.769698 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/35973c92-2b94-4366-aa4b-637920311279-server-conf\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.769728 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/35973c92-2b94-4366-aa4b-637920311279-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.769797 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/35973c92-2b94-4366-aa4b-637920311279-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.769827 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/35973c92-2b94-4366-aa4b-637920311279-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.769895 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/35973c92-2b94-4366-aa4b-637920311279-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.769920 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/35973c92-2b94-4366-aa4b-637920311279-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.769941 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/35973c92-2b94-4366-aa4b-637920311279-pod-info\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.769967 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.769984 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8tmb\" (UniqueName: \"kubernetes.io/projected/35973c92-2b94-4366-aa4b-637920311279-kube-api-access-c8tmb\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.770012 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/35973c92-2b94-4366-aa4b-637920311279-config-data\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.770029 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/35973c92-2b94-4366-aa4b-637920311279-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.771253 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90" (UID: "3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.772564 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90" (UID: "3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.777614 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90" (UID: "3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.779548 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90" (UID: "3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.779675 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-pod-info" (OuterVolumeSpecName: "pod-info") pod "3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90" (UID: "3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.781759 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90" (UID: "3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.781942 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-kube-api-access-qmxmj" (OuterVolumeSpecName: "kube-api-access-qmxmj") pod "3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90" (UID: "3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90"). InnerVolumeSpecName "kube-api-access-qmxmj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.786413 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90" (UID: "3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.820814 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-config-data" (OuterVolumeSpecName: "config-data") pod "3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90" (UID: "3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.873175 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/35973c92-2b94-4366-aa4b-637920311279-server-conf\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.873250 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/35973c92-2b94-4366-aa4b-637920311279-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.873297 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/35973c92-2b94-4366-aa4b-637920311279-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.873332 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/35973c92-2b94-4366-aa4b-637920311279-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.873397 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/35973c92-2b94-4366-aa4b-637920311279-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.873426 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/35973c92-2b94-4366-aa4b-637920311279-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.873451 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/35973c92-2b94-4366-aa4b-637920311279-pod-info\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.873513 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.873554 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8tmb\" (UniqueName: \"kubernetes.io/projected/35973c92-2b94-4366-aa4b-637920311279-kube-api-access-c8tmb\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.873598 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/35973c92-2b94-4366-aa4b-637920311279-config-data\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.873617 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/35973c92-2b94-4366-aa4b-637920311279-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.873714 4869 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.873737 4869 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-pod-info\") on node \"crc\" DevicePath \"\""
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.873754 4869 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.873799 4869 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" "
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.873814 4869 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.873824 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.873834 4869 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-plugins-conf\") on node \"crc\" DevicePath \"\""
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.873847 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmxmj\" (UniqueName: \"kubernetes.io/projected/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-kube-api-access-qmxmj\") on node \"crc\" DevicePath \"\""
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.873860 4869 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.877159 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/35973c92-2b94-4366-aa4b-637920311279-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.877578 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.880956 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/35973c92-2b94-4366-aa4b-637920311279-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.885502 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/35973c92-2b94-4366-aa4b-637920311279-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.894685 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/35973c92-2b94-4366-aa4b-637920311279-config-data\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.895429 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/35973c92-2b94-4366-aa4b-637920311279-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.898411 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8tmb\" (UniqueName: \"kubernetes.io/projected/35973c92-2b94-4366-aa4b-637920311279-kube-api-access-c8tmb\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.902090 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/35973c92-2b94-4366-aa4b-637920311279-server-conf\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.916755 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/35973c92-2b94-4366-aa4b-637920311279-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.930878 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/35973c92-2b94-4366-aa4b-637920311279-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.946876 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-server-conf" (OuterVolumeSpecName: "server-conf") pod "3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90" (UID: "3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.950503 4869 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.960314 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/35973c92-2b94-4366-aa4b-637920311279-pod-info\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.969434 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"35973c92-2b94-4366-aa4b-637920311279\") " pod="openstack/rabbitmq-server-0"
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.976306 4869 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\""
Feb 18 06:09:15 crc kubenswrapper[4869]: I0218 06:09:15.976335 4869 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-server-conf\") on node \"crc\" DevicePath \"\""
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.003942 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90" (UID: "3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.008389 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.078326 4869 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.509613 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90","Type":"ContainerDied","Data":"cbff143e98c3b11a108fb99da87eed7d495bdf541a0319cc73cb3c691cf43ea3"}
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.509633 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.510169 4869 scope.go:117] "RemoveContainer" containerID="9d2471433996dbecba9e05447defe62056ea851a4aa9534dca6057e96725ce44"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.519322 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.536085 4869 scope.go:117] "RemoveContainer" containerID="1683808ddc02a9672a20148fe3fe1be88215576ea37bf98a1fdafb9845128cd9"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.561867 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.583808 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.612994 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 18 06:09:16 crc kubenswrapper[4869]: E0218 06:09:16.613401 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90" containerName="setup-container"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.613414 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90" containerName="setup-container"
Feb 18 06:09:16 crc kubenswrapper[4869]: E0218 06:09:16.613549 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90" containerName="rabbitmq"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.613560 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90" containerName="rabbitmq"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.620127 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90" containerName="rabbitmq"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.621222 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.629733 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-4bnpk"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.630298 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.630455 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.630621 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.630743 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.630880 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.631074 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.640373 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.688968 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/eb17f4cc-a879-4fb2-be2e-4e0e47167746-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.689031 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/eb17f4cc-a879-4fb2-be2e-4e0e47167746-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.689336 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.689526 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/eb17f4cc-a879-4fb2-be2e-4e0e47167746-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.689615 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhlkq\" (UniqueName: \"kubernetes.io/projected/eb17f4cc-a879-4fb2-be2e-4e0e47167746-kube-api-access-dhlkq\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.689702 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/eb17f4cc-a879-4fb2-be2e-4e0e47167746-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.689810 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eb17f4cc-a879-4fb2-be2e-4e0e47167746-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.689898 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/eb17f4cc-a879-4fb2-be2e-4e0e47167746-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.689928 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/eb17f4cc-a879-4fb2-be2e-4e0e47167746-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.690001 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/eb17f4cc-a879-4fb2-be2e-4e0e47167746-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.690064 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/eb17f4cc-a879-4fb2-be2e-4e0e47167746-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.791484 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhlkq\" (UniqueName: \"kubernetes.io/projected/eb17f4cc-a879-4fb2-be2e-4e0e47167746-kube-api-access-dhlkq\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.791547 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/eb17f4cc-a879-4fb2-be2e-4e0e47167746-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.791599 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eb17f4cc-a879-4fb2-be2e-4e0e47167746-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.791621 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/eb17f4cc-a879-4fb2-be2e-4e0e47167746-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.791637 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/eb17f4cc-a879-4fb2-be2e-4e0e47167746-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.791658 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/eb17f4cc-a879-4fb2-be2e-4e0e47167746-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.791690 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/eb17f4cc-a879-4fb2-be2e-4e0e47167746-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.791729 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/eb17f4cc-a879-4fb2-be2e-4e0e47167746-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.791768 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/eb17f4cc-a879-4fb2-be2e-4e0e47167746-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.791827 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.791880 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/eb17f4cc-a879-4fb2-be2e-4e0e47167746-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18
06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.794602 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/eb17f4cc-a879-4fb2-be2e-4e0e47167746-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.795633 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eb17f4cc-a879-4fb2-be2e-4e0e47167746-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.796100 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/eb17f4cc-a879-4fb2-be2e-4e0e47167746-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.797172 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.800295 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/eb17f4cc-a879-4fb2-be2e-4e0e47167746-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.807285 4869 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/eb17f4cc-a879-4fb2-be2e-4e0e47167746-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.807444 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/eb17f4cc-a879-4fb2-be2e-4e0e47167746-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.807961 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/eb17f4cc-a879-4fb2-be2e-4e0e47167746-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.808521 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/eb17f4cc-a879-4fb2-be2e-4e0e47167746-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.813331 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/eb17f4cc-a879-4fb2-be2e-4e0e47167746-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.818920 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhlkq\" (UniqueName: \"kubernetes.io/projected/eb17f4cc-a879-4fb2-be2e-4e0e47167746-kube-api-access-dhlkq\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.834946 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"eb17f4cc-a879-4fb2-be2e-4e0e47167746\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:09:16 crc kubenswrapper[4869]: I0218 06:09:16.993879 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:09:17 crc kubenswrapper[4869]: W0218 06:09:17.463865 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb17f4cc_a879_4fb2_be2e_4e0e47167746.slice/crio-488121f3d278b5e36008f7d21abc5c4286910582a7ba4c602cb0ea6b34097fab WatchSource:0}: Error finding container 488121f3d278b5e36008f7d21abc5c4286910582a7ba4c602cb0ea6b34097fab: Status 404 returned error can't find the container with id 488121f3d278b5e36008f7d21abc5c4286910582a7ba4c602cb0ea6b34097fab Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.468109 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.493097 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15f90eb3-a8d8-489d-b8f6-41046e14e165" path="/var/lib/kubelet/pods/15f90eb3-a8d8-489d-b8f6-41046e14e165/volumes" Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.494666 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90" path="/var/lib/kubelet/pods/3620bb0b-e8d0-4dd0-a2f1-6960f4ca8e90/volumes" Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.527205 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"35973c92-2b94-4366-aa4b-637920311279","Type":"ContainerStarted","Data":"c6cc6120d68f0b410aef84d68dd795fdf12ada9068777fd536790ed6a5f548f4"} Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.529589 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"eb17f4cc-a879-4fb2-be2e-4e0e47167746","Type":"ContainerStarted","Data":"488121f3d278b5e36008f7d21abc5c4286910582a7ba4c602cb0ea6b34097fab"} Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.574579 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-d2btp"] Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.576631 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.589994 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.628636 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-d2btp"] Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.724088 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-d2btp\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.724167 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-d2btp\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" Feb 18 06:09:17 crc 
kubenswrapper[4869]: I0218 06:09:17.724210 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-config\") pod \"dnsmasq-dns-79bd4cc8c9-d2btp\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.724405 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f28c\" (UniqueName: \"kubernetes.io/projected/6dffd56c-abee-43fa-802d-541d93f73ed5-kube-api-access-7f28c\") pod \"dnsmasq-dns-79bd4cc8c9-d2btp\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.724595 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-d2btp\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.724831 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-d2btp\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.724932 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-d2btp\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") " 
pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.826738 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-d2btp\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.826817 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-d2btp\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.826873 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-d2btp\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.826911 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-d2btp\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.826930 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-config\") pod \"dnsmasq-dns-79bd4cc8c9-d2btp\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" Feb 18 06:09:17 crc 
kubenswrapper[4869]: I0218 06:09:17.826960 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f28c\" (UniqueName: \"kubernetes.io/projected/6dffd56c-abee-43fa-802d-541d93f73ed5-kube-api-access-7f28c\") pod \"dnsmasq-dns-79bd4cc8c9-d2btp\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.827005 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-d2btp\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.827945 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-d2btp\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.828480 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-d2btp\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.828997 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-d2btp\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.829476 4869 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-d2btp\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.829984 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-d2btp\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.830449 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-config\") pod \"dnsmasq-dns-79bd4cc8c9-d2btp\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.848651 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f28c\" (UniqueName: \"kubernetes.io/projected/6dffd56c-abee-43fa-802d-541d93f73ed5-kube-api-access-7f28c\") pod \"dnsmasq-dns-79bd4cc8c9-d2btp\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" Feb 18 06:09:17 crc kubenswrapper[4869]: I0218 06:09:17.961514 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" Feb 18 06:09:18 crc kubenswrapper[4869]: I0218 06:09:18.545955 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"35973c92-2b94-4366-aa4b-637920311279","Type":"ContainerStarted","Data":"8d52e7e27a7cfd8273cf0231eb29e12d538ec748f65c6bb0ceff6fa4be2a69c2"} Feb 18 06:09:18 crc kubenswrapper[4869]: W0218 06:09:18.722019 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6dffd56c_abee_43fa_802d_541d93f73ed5.slice/crio-0c4e12ab8341796b947b4a82a705e808bb406c056b9ea5b6f7b8f91bbac198d0 WatchSource:0}: Error finding container 0c4e12ab8341796b947b4a82a705e808bb406c056b9ea5b6f7b8f91bbac198d0: Status 404 returned error can't find the container with id 0c4e12ab8341796b947b4a82a705e808bb406c056b9ea5b6f7b8f91bbac198d0 Feb 18 06:09:18 crc kubenswrapper[4869]: I0218 06:09:18.731214 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-d2btp"] Feb 18 06:09:19 crc kubenswrapper[4869]: I0218 06:09:19.564549 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"eb17f4cc-a879-4fb2-be2e-4e0e47167746","Type":"ContainerStarted","Data":"c50935a1261d4de47f8f3365b97f4b74a80b834d5ae2c7695e15dc8a26f3522c"} Feb 18 06:09:19 crc kubenswrapper[4869]: I0218 06:09:19.567100 4869 generic.go:334] "Generic (PLEG): container finished" podID="6dffd56c-abee-43fa-802d-541d93f73ed5" containerID="bcb25fa0c07dc2348e79983332312500a93f764ea3d0edbedd169824fa0aedf9" exitCode=0 Feb 18 06:09:19 crc kubenswrapper[4869]: I0218 06:09:19.567189 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" event={"ID":"6dffd56c-abee-43fa-802d-541d93f73ed5","Type":"ContainerDied","Data":"bcb25fa0c07dc2348e79983332312500a93f764ea3d0edbedd169824fa0aedf9"} Feb 18 06:09:19 crc kubenswrapper[4869]: 
I0218 06:09:19.567278 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" event={"ID":"6dffd56c-abee-43fa-802d-541d93f73ed5","Type":"ContainerStarted","Data":"0c4e12ab8341796b947b4a82a705e808bb406c056b9ea5b6f7b8f91bbac198d0"} Feb 18 06:09:20 crc kubenswrapper[4869]: I0218 06:09:20.583535 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" event={"ID":"6dffd56c-abee-43fa-802d-541d93f73ed5","Type":"ContainerStarted","Data":"2760e6310db792649598e759918f891fe968d52aef3d7a1ee29a04155fae2b60"} Feb 18 06:09:20 crc kubenswrapper[4869]: I0218 06:09:20.583950 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" Feb 18 06:09:20 crc kubenswrapper[4869]: I0218 06:09:20.619459 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" podStartSLOduration=3.619439055 podStartE2EDuration="3.619439055s" podCreationTimestamp="2026-02-18 06:09:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:09:20.607367754 +0000 UTC m=+1257.776455996" watchObservedRunningTime="2026-02-18 06:09:20.619439055 +0000 UTC m=+1257.788527287" Feb 18 06:09:27 crc kubenswrapper[4869]: I0218 06:09:27.963372 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.027804 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-hz2g7"] Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.028051 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" podUID="a3515130-3f9f-42ab-8bc7-6d357e1d645a" containerName="dnsmasq-dns" 
containerID="cri-o://f17beb1933a37194a53aa2bcc4b11f95a8db3d48f1f0ae74716c7849df76137c" gracePeriod=10 Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.222687 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55478c4467-n44p5"] Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.230727 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-n44p5" Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.248430 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-n44p5"] Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.341230 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57134f39-764c-4164-a5ab-9392660d554b-dns-svc\") pod \"dnsmasq-dns-55478c4467-n44p5\" (UID: \"57134f39-764c-4164-a5ab-9392660d554b\") " pod="openstack/dnsmasq-dns-55478c4467-n44p5" Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.341286 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/57134f39-764c-4164-a5ab-9392660d554b-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-n44p5\" (UID: \"57134f39-764c-4164-a5ab-9392660d554b\") " pod="openstack/dnsmasq-dns-55478c4467-n44p5" Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.341314 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w55l\" (UniqueName: \"kubernetes.io/projected/57134f39-764c-4164-a5ab-9392660d554b-kube-api-access-9w55l\") pod \"dnsmasq-dns-55478c4467-n44p5\" (UID: \"57134f39-764c-4164-a5ab-9392660d554b\") " pod="openstack/dnsmasq-dns-55478c4467-n44p5" Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.341572 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57134f39-764c-4164-a5ab-9392660d554b-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-n44p5\" (UID: \"57134f39-764c-4164-a5ab-9392660d554b\") " pod="openstack/dnsmasq-dns-55478c4467-n44p5" Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.341698 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/57134f39-764c-4164-a5ab-9392660d554b-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-n44p5\" (UID: \"57134f39-764c-4164-a5ab-9392660d554b\") " pod="openstack/dnsmasq-dns-55478c4467-n44p5" Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.341901 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57134f39-764c-4164-a5ab-9392660d554b-config\") pod \"dnsmasq-dns-55478c4467-n44p5\" (UID: \"57134f39-764c-4164-a5ab-9392660d554b\") " pod="openstack/dnsmasq-dns-55478c4467-n44p5" Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.342040 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57134f39-764c-4164-a5ab-9392660d554b-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-n44p5\" (UID: \"57134f39-764c-4164-a5ab-9392660d554b\") " pod="openstack/dnsmasq-dns-55478c4467-n44p5" Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.444234 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57134f39-764c-4164-a5ab-9392660d554b-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-n44p5\" (UID: \"57134f39-764c-4164-a5ab-9392660d554b\") " pod="openstack/dnsmasq-dns-55478c4467-n44p5" Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.444316 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57134f39-764c-4164-a5ab-9392660d554b-dns-svc\") pod \"dnsmasq-dns-55478c4467-n44p5\" (UID: \"57134f39-764c-4164-a5ab-9392660d554b\") " pod="openstack/dnsmasq-dns-55478c4467-n44p5" Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.444357 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/57134f39-764c-4164-a5ab-9392660d554b-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-n44p5\" (UID: \"57134f39-764c-4164-a5ab-9392660d554b\") " pod="openstack/dnsmasq-dns-55478c4467-n44p5" Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.444386 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w55l\" (UniqueName: \"kubernetes.io/projected/57134f39-764c-4164-a5ab-9392660d554b-kube-api-access-9w55l\") pod \"dnsmasq-dns-55478c4467-n44p5\" (UID: \"57134f39-764c-4164-a5ab-9392660d554b\") " pod="openstack/dnsmasq-dns-55478c4467-n44p5" Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.444453 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57134f39-764c-4164-a5ab-9392660d554b-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-n44p5\" (UID: \"57134f39-764c-4164-a5ab-9392660d554b\") " pod="openstack/dnsmasq-dns-55478c4467-n44p5" Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.444498 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/57134f39-764c-4164-a5ab-9392660d554b-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-n44p5\" (UID: \"57134f39-764c-4164-a5ab-9392660d554b\") " pod="openstack/dnsmasq-dns-55478c4467-n44p5" Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.444534 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/57134f39-764c-4164-a5ab-9392660d554b-config\") pod \"dnsmasq-dns-55478c4467-n44p5\" (UID: \"57134f39-764c-4164-a5ab-9392660d554b\") " pod="openstack/dnsmasq-dns-55478c4467-n44p5"
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.445637 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57134f39-764c-4164-a5ab-9392660d554b-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-n44p5\" (UID: \"57134f39-764c-4164-a5ab-9392660d554b\") " pod="openstack/dnsmasq-dns-55478c4467-n44p5"
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.445688 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57134f39-764c-4164-a5ab-9392660d554b-config\") pod \"dnsmasq-dns-55478c4467-n44p5\" (UID: \"57134f39-764c-4164-a5ab-9392660d554b\") " pod="openstack/dnsmasq-dns-55478c4467-n44p5"
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.445922 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/57134f39-764c-4164-a5ab-9392660d554b-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-n44p5\" (UID: \"57134f39-764c-4164-a5ab-9392660d554b\") " pod="openstack/dnsmasq-dns-55478c4467-n44p5"
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.446118 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/57134f39-764c-4164-a5ab-9392660d554b-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-n44p5\" (UID: \"57134f39-764c-4164-a5ab-9392660d554b\") " pod="openstack/dnsmasq-dns-55478c4467-n44p5"
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.446137 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57134f39-764c-4164-a5ab-9392660d554b-dns-svc\") pod \"dnsmasq-dns-55478c4467-n44p5\" (UID: \"57134f39-764c-4164-a5ab-9392660d554b\") " pod="openstack/dnsmasq-dns-55478c4467-n44p5"
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.447006 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57134f39-764c-4164-a5ab-9392660d554b-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-n44p5\" (UID: \"57134f39-764c-4164-a5ab-9392660d554b\") " pod="openstack/dnsmasq-dns-55478c4467-n44p5"
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.471012 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w55l\" (UniqueName: \"kubernetes.io/projected/57134f39-764c-4164-a5ab-9392660d554b-kube-api-access-9w55l\") pod \"dnsmasq-dns-55478c4467-n44p5\" (UID: \"57134f39-764c-4164-a5ab-9392660d554b\") " pod="openstack/dnsmasq-dns-55478c4467-n44p5"
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.552263 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-n44p5"
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.555330 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7"
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.648137 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-config\") pod \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\" (UID: \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\") "
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.648183 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-dns-swift-storage-0\") pod \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\" (UID: \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\") "
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.648223 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-ovsdbserver-sb\") pod \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\" (UID: \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\") "
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.648273 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-dns-svc\") pod \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\" (UID: \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\") "
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.648377 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm8d4\" (UniqueName: \"kubernetes.io/projected/a3515130-3f9f-42ab-8bc7-6d357e1d645a-kube-api-access-cm8d4\") pod \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\" (UID: \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\") "
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.648422 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-ovsdbserver-nb\") pod \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\" (UID: \"a3515130-3f9f-42ab-8bc7-6d357e1d645a\") "
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.671655 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3515130-3f9f-42ab-8bc7-6d357e1d645a-kube-api-access-cm8d4" (OuterVolumeSpecName: "kube-api-access-cm8d4") pod "a3515130-3f9f-42ab-8bc7-6d357e1d645a" (UID: "a3515130-3f9f-42ab-8bc7-6d357e1d645a"). InnerVolumeSpecName "kube-api-access-cm8d4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.705519 4869 generic.go:334] "Generic (PLEG): container finished" podID="a3515130-3f9f-42ab-8bc7-6d357e1d645a" containerID="f17beb1933a37194a53aa2bcc4b11f95a8db3d48f1f0ae74716c7849df76137c" exitCode=0
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.705568 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" event={"ID":"a3515130-3f9f-42ab-8bc7-6d357e1d645a","Type":"ContainerDied","Data":"f17beb1933a37194a53aa2bcc4b11f95a8db3d48f1f0ae74716c7849df76137c"}
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.705595 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7" event={"ID":"a3515130-3f9f-42ab-8bc7-6d357e1d645a","Type":"ContainerDied","Data":"b87cdedd15337500e74b18d0cbe47da17a30787a9469d0ce85670611979fffed"}
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.705641 4869 scope.go:117] "RemoveContainer" containerID="f17beb1933a37194a53aa2bcc4b11f95a8db3d48f1f0ae74716c7849df76137c"
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.705635 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-hz2g7"
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.718562 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-config" (OuterVolumeSpecName: "config") pod "a3515130-3f9f-42ab-8bc7-6d357e1d645a" (UID: "a3515130-3f9f-42ab-8bc7-6d357e1d645a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.722003 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a3515130-3f9f-42ab-8bc7-6d357e1d645a" (UID: "a3515130-3f9f-42ab-8bc7-6d357e1d645a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.732062 4869 scope.go:117] "RemoveContainer" containerID="3ef2c1796fc3362d9041e6c069a247f92e632f34768d72c11a310397dd391003"
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.733794 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a3515130-3f9f-42ab-8bc7-6d357e1d645a" (UID: "a3515130-3f9f-42ab-8bc7-6d357e1d645a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.734715 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a3515130-3f9f-42ab-8bc7-6d357e1d645a" (UID: "a3515130-3f9f-42ab-8bc7-6d357e1d645a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.751869 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a3515130-3f9f-42ab-8bc7-6d357e1d645a" (UID: "a3515130-3f9f-42ab-8bc7-6d357e1d645a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.753070 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-config\") on node \"crc\" DevicePath \"\""
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.753096 4869 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.753107 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.753117 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.753126 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm8d4\" (UniqueName: \"kubernetes.io/projected/a3515130-3f9f-42ab-8bc7-6d357e1d645a-kube-api-access-cm8d4\") on node \"crc\" DevicePath \"\""
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.753136 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3515130-3f9f-42ab-8bc7-6d357e1d645a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.768685 4869 scope.go:117] "RemoveContainer" containerID="f17beb1933a37194a53aa2bcc4b11f95a8db3d48f1f0ae74716c7849df76137c"
Feb 18 06:09:28 crc kubenswrapper[4869]: E0218 06:09:28.769853 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f17beb1933a37194a53aa2bcc4b11f95a8db3d48f1f0ae74716c7849df76137c\": container with ID starting with f17beb1933a37194a53aa2bcc4b11f95a8db3d48f1f0ae74716c7849df76137c not found: ID does not exist" containerID="f17beb1933a37194a53aa2bcc4b11f95a8db3d48f1f0ae74716c7849df76137c"
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.769899 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f17beb1933a37194a53aa2bcc4b11f95a8db3d48f1f0ae74716c7849df76137c"} err="failed to get container status \"f17beb1933a37194a53aa2bcc4b11f95a8db3d48f1f0ae74716c7849df76137c\": rpc error: code = NotFound desc = could not find container \"f17beb1933a37194a53aa2bcc4b11f95a8db3d48f1f0ae74716c7849df76137c\": container with ID starting with f17beb1933a37194a53aa2bcc4b11f95a8db3d48f1f0ae74716c7849df76137c not found: ID does not exist"
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.769930 4869 scope.go:117] "RemoveContainer" containerID="3ef2c1796fc3362d9041e6c069a247f92e632f34768d72c11a310397dd391003"
Feb 18 06:09:28 crc kubenswrapper[4869]: E0218 06:09:28.771277 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ef2c1796fc3362d9041e6c069a247f92e632f34768d72c11a310397dd391003\": container with ID starting with 3ef2c1796fc3362d9041e6c069a247f92e632f34768d72c11a310397dd391003 not found: ID does not exist" containerID="3ef2c1796fc3362d9041e6c069a247f92e632f34768d72c11a310397dd391003"
Feb 18 06:09:28 crc kubenswrapper[4869]: I0218 06:09:28.771306 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ef2c1796fc3362d9041e6c069a247f92e632f34768d72c11a310397dd391003"} err="failed to get container status \"3ef2c1796fc3362d9041e6c069a247f92e632f34768d72c11a310397dd391003\": rpc error: code = NotFound desc = could not find container \"3ef2c1796fc3362d9041e6c069a247f92e632f34768d72c11a310397dd391003\": container with ID starting with 3ef2c1796fc3362d9041e6c069a247f92e632f34768d72c11a310397dd391003 not found: ID does not exist"
Feb 18 06:09:29 crc kubenswrapper[4869]: I0218 06:09:29.052446 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-hz2g7"]
Feb 18 06:09:29 crc kubenswrapper[4869]: I0218 06:09:29.069943 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-hz2g7"]
Feb 18 06:09:29 crc kubenswrapper[4869]: I0218 06:09:29.086267 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-n44p5"]
Feb 18 06:09:29 crc kubenswrapper[4869]: I0218 06:09:29.489832 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3515130-3f9f-42ab-8bc7-6d357e1d645a" path="/var/lib/kubelet/pods/a3515130-3f9f-42ab-8bc7-6d357e1d645a/volumes"
Feb 18 06:09:29 crc kubenswrapper[4869]: I0218 06:09:29.728306 4869 generic.go:334] "Generic (PLEG): container finished" podID="57134f39-764c-4164-a5ab-9392660d554b" containerID="db64e2e85970a9ce511f7fad7f1ad23fed54da756549a22d94a49de81a399346" exitCode=0
Feb 18 06:09:29 crc kubenswrapper[4869]: I0218 06:09:29.728391 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-n44p5" event={"ID":"57134f39-764c-4164-a5ab-9392660d554b","Type":"ContainerDied","Data":"db64e2e85970a9ce511f7fad7f1ad23fed54da756549a22d94a49de81a399346"}
Feb 18 06:09:29 crc kubenswrapper[4869]: I0218 06:09:29.728809 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-n44p5" event={"ID":"57134f39-764c-4164-a5ab-9392660d554b","Type":"ContainerStarted","Data":"a91d170ba8a237106efcb7497509b287ff005fbf278eb9009a79bea873ca0027"}
Feb 18 06:09:30 crc kubenswrapper[4869]: I0218 06:09:30.739180 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-n44p5" event={"ID":"57134f39-764c-4164-a5ab-9392660d554b","Type":"ContainerStarted","Data":"4eb45337f31fc29bab9b74ce95fec422b0aa5a55cd185412d67dd21e2c8f5036"}
Feb 18 06:09:30 crc kubenswrapper[4869]: I0218 06:09:30.739679 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55478c4467-n44p5"
Feb 18 06:09:30 crc kubenswrapper[4869]: I0218 06:09:30.761035 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55478c4467-n44p5" podStartSLOduration=2.7610152660000002 podStartE2EDuration="2.761015266s" podCreationTimestamp="2026-02-18 06:09:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:09:30.759713105 +0000 UTC m=+1267.928801347" watchObservedRunningTime="2026-02-18 06:09:30.761015266 +0000 UTC m=+1267.930103498"
Feb 18 06:09:38 crc kubenswrapper[4869]: I0218 06:09:38.554733 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55478c4467-n44p5"
Feb 18 06:09:38 crc kubenswrapper[4869]: I0218 06:09:38.633790 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-d2btp"]
Feb 18 06:09:38 crc kubenswrapper[4869]: I0218 06:09:38.634620 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" podUID="6dffd56c-abee-43fa-802d-541d93f73ed5" containerName="dnsmasq-dns" containerID="cri-o://2760e6310db792649598e759918f891fe968d52aef3d7a1ee29a04155fae2b60" gracePeriod=10
Feb 18 06:09:38 crc kubenswrapper[4869]: I0218 06:09:38.822428 4869 generic.go:334] "Generic (PLEG): container finished" podID="6dffd56c-abee-43fa-802d-541d93f73ed5" containerID="2760e6310db792649598e759918f891fe968d52aef3d7a1ee29a04155fae2b60" exitCode=0
Feb 18 06:09:38 crc kubenswrapper[4869]: I0218 06:09:38.822574 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" event={"ID":"6dffd56c-abee-43fa-802d-541d93f73ed5","Type":"ContainerDied","Data":"2760e6310db792649598e759918f891fe968d52aef3d7a1ee29a04155fae2b60"}
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.113952 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp"
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.260725 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f28c\" (UniqueName: \"kubernetes.io/projected/6dffd56c-abee-43fa-802d-541d93f73ed5-kube-api-access-7f28c\") pod \"6dffd56c-abee-43fa-802d-541d93f73ed5\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") "
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.260818 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-ovsdbserver-nb\") pod \"6dffd56c-abee-43fa-802d-541d93f73ed5\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") "
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.260922 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-dns-swift-storage-0\") pod \"6dffd56c-abee-43fa-802d-541d93f73ed5\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") "
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.260992 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-openstack-edpm-ipam\") pod \"6dffd56c-abee-43fa-802d-541d93f73ed5\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") "
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.261075 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-config\") pod \"6dffd56c-abee-43fa-802d-541d93f73ed5\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") "
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.261290 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-ovsdbserver-sb\") pod \"6dffd56c-abee-43fa-802d-541d93f73ed5\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") "
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.261372 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-dns-svc\") pod \"6dffd56c-abee-43fa-802d-541d93f73ed5\" (UID: \"6dffd56c-abee-43fa-802d-541d93f73ed5\") "
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.284713 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dffd56c-abee-43fa-802d-541d93f73ed5-kube-api-access-7f28c" (OuterVolumeSpecName: "kube-api-access-7f28c") pod "6dffd56c-abee-43fa-802d-541d93f73ed5" (UID: "6dffd56c-abee-43fa-802d-541d93f73ed5"). InnerVolumeSpecName "kube-api-access-7f28c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.315641 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6dffd56c-abee-43fa-802d-541d93f73ed5" (UID: "6dffd56c-abee-43fa-802d-541d93f73ed5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.324333 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "6dffd56c-abee-43fa-802d-541d93f73ed5" (UID: "6dffd56c-abee-43fa-802d-541d93f73ed5"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.325174 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6dffd56c-abee-43fa-802d-541d93f73ed5" (UID: "6dffd56c-abee-43fa-802d-541d93f73ed5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.329596 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6dffd56c-abee-43fa-802d-541d93f73ed5" (UID: "6dffd56c-abee-43fa-802d-541d93f73ed5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.357890 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-config" (OuterVolumeSpecName: "config") pod "6dffd56c-abee-43fa-802d-541d93f73ed5" (UID: "6dffd56c-abee-43fa-802d-541d93f73ed5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.359806 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6dffd56c-abee-43fa-802d-541d93f73ed5" (UID: "6dffd56c-abee-43fa-802d-541d93f73ed5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.363886 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f28c\" (UniqueName: \"kubernetes.io/projected/6dffd56c-abee-43fa-802d-541d93f73ed5-kube-api-access-7f28c\") on node \"crc\" DevicePath \"\""
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.363919 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.363933 4869 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.363949 4869 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.363969 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-config\") on node \"crc\" DevicePath \"\""
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.363985 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.363999 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dffd56c-abee-43fa-802d-541d93f73ed5-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.833133 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp" event={"ID":"6dffd56c-abee-43fa-802d-541d93f73ed5","Type":"ContainerDied","Data":"0c4e12ab8341796b947b4a82a705e808bb406c056b9ea5b6f7b8f91bbac198d0"}
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.833594 4869 scope.go:117] "RemoveContainer" containerID="2760e6310db792649598e759918f891fe968d52aef3d7a1ee29a04155fae2b60"
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.833198 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-d2btp"
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.873866 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-d2btp"]
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.874345 4869 scope.go:117] "RemoveContainer" containerID="bcb25fa0c07dc2348e79983332312500a93f764ea3d0edbedd169824fa0aedf9"
Feb 18 06:09:39 crc kubenswrapper[4869]: I0218 06:09:39.886160 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-d2btp"]
Feb 18 06:09:40 crc kubenswrapper[4869]: I0218 06:09:40.132405 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 06:09:40 crc kubenswrapper[4869]: I0218 06:09:40.132960 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 06:09:41 crc kubenswrapper[4869]: I0218 06:09:41.492661 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dffd56c-abee-43fa-802d-541d93f73ed5" path="/var/lib/kubelet/pods/6dffd56c-abee-43fa-802d-541d93f73ed5/volumes"
Feb 18 06:09:47 crc kubenswrapper[4869]: I0218 06:09:47.442197 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8"]
Feb 18 06:09:47 crc kubenswrapper[4869]: E0218 06:09:47.443420 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3515130-3f9f-42ab-8bc7-6d357e1d645a" containerName="dnsmasq-dns"
Feb 18 06:09:47 crc kubenswrapper[4869]: I0218 06:09:47.443436 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3515130-3f9f-42ab-8bc7-6d357e1d645a" containerName="dnsmasq-dns"
Feb 18 06:09:47 crc kubenswrapper[4869]: E0218 06:09:47.443455 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dffd56c-abee-43fa-802d-541d93f73ed5" containerName="init"
Feb 18 06:09:47 crc kubenswrapper[4869]: I0218 06:09:47.443463 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dffd56c-abee-43fa-802d-541d93f73ed5" containerName="init"
Feb 18 06:09:47 crc kubenswrapper[4869]: E0218 06:09:47.443489 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3515130-3f9f-42ab-8bc7-6d357e1d645a" containerName="init"
Feb 18 06:09:47 crc kubenswrapper[4869]: I0218 06:09:47.443497 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3515130-3f9f-42ab-8bc7-6d357e1d645a" containerName="init"
Feb 18 06:09:47 crc kubenswrapper[4869]: E0218 06:09:47.443521 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dffd56c-abee-43fa-802d-541d93f73ed5" containerName="dnsmasq-dns"
Feb 18 06:09:47 crc kubenswrapper[4869]: I0218 06:09:47.443530 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dffd56c-abee-43fa-802d-541d93f73ed5" containerName="dnsmasq-dns"
Feb 18 06:09:47 crc kubenswrapper[4869]: I0218 06:09:47.444112 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3515130-3f9f-42ab-8bc7-6d357e1d645a" containerName="dnsmasq-dns"
Feb 18 06:09:47 crc kubenswrapper[4869]: I0218 06:09:47.444133 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dffd56c-abee-43fa-802d-541d93f73ed5" containerName="dnsmasq-dns"
Feb 18 06:09:47 crc kubenswrapper[4869]: I0218 06:09:47.445227 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8"
Feb 18 06:09:47 crc kubenswrapper[4869]: I0218 06:09:47.447614 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5vjl5"
Feb 18 06:09:47 crc kubenswrapper[4869]: I0218 06:09:47.448158 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 18 06:09:47 crc kubenswrapper[4869]: I0218 06:09:47.448953 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 18 06:09:47 crc kubenswrapper[4869]: I0218 06:09:47.450881 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 18 06:09:47 crc kubenswrapper[4869]: I0218 06:09:47.464919 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8"]
Feb 18 06:09:47 crc kubenswrapper[4869]: I0218 06:09:47.545731 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a49b5bd1-87a9-4536-b8fb-5f32f8024b8a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8\" (UID: \"a49b5bd1-87a9-4536-b8fb-5f32f8024b8a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8"
Feb 18 06:09:47 crc kubenswrapper[4869]: I0218 06:09:47.546375 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4dft\" (UniqueName: \"kubernetes.io/projected/a49b5bd1-87a9-4536-b8fb-5f32f8024b8a-kube-api-access-c4dft\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8\" (UID: \"a49b5bd1-87a9-4536-b8fb-5f32f8024b8a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8"
Feb 18 06:09:47 crc kubenswrapper[4869]: I0218 06:09:47.546543 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a49b5bd1-87a9-4536-b8fb-5f32f8024b8a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8\" (UID: \"a49b5bd1-87a9-4536-b8fb-5f32f8024b8a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8"
Feb 18 06:09:47 crc kubenswrapper[4869]: I0218 06:09:47.546831 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a49b5bd1-87a9-4536-b8fb-5f32f8024b8a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8\" (UID: \"a49b5bd1-87a9-4536-b8fb-5f32f8024b8a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8"
Feb 18 06:09:47 crc kubenswrapper[4869]: I0218 06:09:47.650164 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a49b5bd1-87a9-4536-b8fb-5f32f8024b8a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8\" (UID: \"a49b5bd1-87a9-4536-b8fb-5f32f8024b8a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8"
Feb 18 06:09:47 crc kubenswrapper[4869]: I0218 06:09:47.650887 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4dft\" (UniqueName: \"kubernetes.io/projected/a49b5bd1-87a9-4536-b8fb-5f32f8024b8a-kube-api-access-c4dft\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8\" (UID: \"a49b5bd1-87a9-4536-b8fb-5f32f8024b8a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8"
Feb 18 06:09:47 crc kubenswrapper[4869]: I0218 06:09:47.650970 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a49b5bd1-87a9-4536-b8fb-5f32f8024b8a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8\" (UID: \"a49b5bd1-87a9-4536-b8fb-5f32f8024b8a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8"
Feb 18 06:09:47 crc kubenswrapper[4869]: I0218 06:09:47.651072 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a49b5bd1-87a9-4536-b8fb-5f32f8024b8a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8\" (UID: \"a49b5bd1-87a9-4536-b8fb-5f32f8024b8a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8"
Feb 18 06:09:47 crc kubenswrapper[4869]: I0218 06:09:47.658265 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a49b5bd1-87a9-4536-b8fb-5f32f8024b8a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8\" (UID: \"a49b5bd1-87a9-4536-b8fb-5f32f8024b8a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8"
Feb 18 06:09:47 crc kubenswrapper[4869]: I0218 06:09:47.659582 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a49b5bd1-87a9-4536-b8fb-5f32f8024b8a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8\" (UID: \"a49b5bd1-87a9-4536-b8fb-5f32f8024b8a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8"
Feb 18 06:09:47 crc kubenswrapper[4869]: I0218 06:09:47.662506 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a49b5bd1-87a9-4536-b8fb-5f32f8024b8a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8\" (UID: \"a49b5bd1-87a9-4536-b8fb-5f32f8024b8a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8"
Feb 18 06:09:47 crc kubenswrapper[4869]: I0218 06:09:47.669977 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4dft\" (UniqueName: \"kubernetes.io/projected/a49b5bd1-87a9-4536-b8fb-5f32f8024b8a-kube-api-access-c4dft\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8\" (UID: \"a49b5bd1-87a9-4536-b8fb-5f32f8024b8a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8"
Feb 18 06:09:47 crc kubenswrapper[4869]: I0218 06:09:47.774335 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8"
Feb 18 06:09:48 crc kubenswrapper[4869]: I0218 06:09:48.317879 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8"]
Feb 18 06:09:48 crc kubenswrapper[4869]: W0218 06:09:48.319481 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda49b5bd1_87a9_4536_b8fb_5f32f8024b8a.slice/crio-26b3173b4a8ab27200d3d43b74e6f7aee6ff10ec4879de371050f7789e64fa6b WatchSource:0}: Error finding container 26b3173b4a8ab27200d3d43b74e6f7aee6ff10ec4879de371050f7789e64fa6b: Status 404 returned error can't find the container with id 26b3173b4a8ab27200d3d43b74e6f7aee6ff10ec4879de371050f7789e64fa6b
Feb 18 06:09:48 crc kubenswrapper[4869]: I0218 06:09:48.941947 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8" event={"ID":"a49b5bd1-87a9-4536-b8fb-5f32f8024b8a","Type":"ContainerStarted","Data":"26b3173b4a8ab27200d3d43b74e6f7aee6ff10ec4879de371050f7789e64fa6b"}
Feb 18 06:09:50 crc kubenswrapper[4869]: I0218 06:09:50.963490 4869 generic.go:334] "Generic (PLEG): container finished" podID="35973c92-2b94-4366-aa4b-637920311279" containerID="8d52e7e27a7cfd8273cf0231eb29e12d538ec748f65c6bb0ceff6fa4be2a69c2" exitCode=0
Feb 18 06:09:50 crc kubenswrapper[4869]: I0218 06:09:50.963537 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"35973c92-2b94-4366-aa4b-637920311279","Type":"ContainerDied","Data":"8d52e7e27a7cfd8273cf0231eb29e12d538ec748f65c6bb0ceff6fa4be2a69c2"}
Feb 18 06:09:51 crc kubenswrapper[4869]: I0218 06:09:51.976455 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"35973c92-2b94-4366-aa4b-637920311279","Type":"ContainerStarted","Data":"0e69e5fc6e9368a591cbaa417f96958854ca45a83c403fc7b489ee0f460ab6e3"}
Feb 18 06:09:51 crc kubenswrapper[4869]: I0218 06:09:51.977009 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 18 06:09:51 crc kubenswrapper[4869]: I0218 06:09:51.980648 4869 generic.go:334] "Generic (PLEG): container finished" podID="eb17f4cc-a879-4fb2-be2e-4e0e47167746" containerID="c50935a1261d4de47f8f3365b97f4b74a80b834d5ae2c7695e15dc8a26f3522c" exitCode=0
Feb 18 06:09:51 crc kubenswrapper[4869]: I0218 06:09:51.980688 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"eb17f4cc-a879-4fb2-be2e-4e0e47167746","Type":"ContainerDied","Data":"c50935a1261d4de47f8f3365b97f4b74a80b834d5ae2c7695e15dc8a26f3522c"}
Feb 18 06:09:52 crc kubenswrapper[4869]: I0218 06:09:52.006403 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.006381349 podStartE2EDuration="37.006381349s" podCreationTimestamp="2026-02-18 06:09:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:09:51.994645226 +0000 UTC m=+1289.163733458" watchObservedRunningTime="2026-02-18 06:09:52.006381349 +0000 UTC m=+1289.175469581"
Feb 18
06:09:58 crc kubenswrapper[4869]: I0218 06:09:58.083336 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8" podStartSLOduration=1.631862248 podStartE2EDuration="11.083309707s" podCreationTimestamp="2026-02-18 06:09:47 +0000 UTC" firstStartedPulling="2026-02-18 06:09:48.322074961 +0000 UTC m=+1285.491163203" lastFinishedPulling="2026-02-18 06:09:57.7735224 +0000 UTC m=+1294.942610662" observedRunningTime="2026-02-18 06:09:58.079827442 +0000 UTC m=+1295.248915674" watchObservedRunningTime="2026-02-18 06:09:58.083309707 +0000 UTC m=+1295.252397969" Feb 18 06:09:59 crc kubenswrapper[4869]: I0218 06:09:59.075140 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8" event={"ID":"a49b5bd1-87a9-4536-b8fb-5f32f8024b8a","Type":"ContainerStarted","Data":"3628f32ecca77a513f6cc3cbbfd2f9abf6ed05d8cd15cd23257b6731017b12be"} Feb 18 06:09:59 crc kubenswrapper[4869]: I0218 06:09:59.076882 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"eb17f4cc-a879-4fb2-be2e-4e0e47167746","Type":"ContainerStarted","Data":"7bf563b705dda3e4476129e126990518a4479b5b5f7cebd301b6526ee85a1a6f"} Feb 18 06:09:59 crc kubenswrapper[4869]: I0218 06:09:59.077135 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:09:59 crc kubenswrapper[4869]: I0218 06:09:59.121786 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=43.121768015 podStartE2EDuration="43.121768015s" podCreationTimestamp="2026-02-18 06:09:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:09:59.109137071 +0000 UTC m=+1296.278225303" watchObservedRunningTime="2026-02-18 
06:09:59.121768015 +0000 UTC m=+1296.290856247" Feb 18 06:10:00 crc kubenswrapper[4869]: I0218 06:10:00.594461 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mwcql"] Feb 18 06:10:00 crc kubenswrapper[4869]: I0218 06:10:00.597409 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mwcql" Feb 18 06:10:00 crc kubenswrapper[4869]: I0218 06:10:00.612658 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mwcql"] Feb 18 06:10:00 crc kubenswrapper[4869]: I0218 06:10:00.636921 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00660609-49d3-4cf5-a722-35739a5493e8-utilities\") pod \"redhat-marketplace-mwcql\" (UID: \"00660609-49d3-4cf5-a722-35739a5493e8\") " pod="openshift-marketplace/redhat-marketplace-mwcql" Feb 18 06:10:00 crc kubenswrapper[4869]: I0218 06:10:00.637076 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00660609-49d3-4cf5-a722-35739a5493e8-catalog-content\") pod \"redhat-marketplace-mwcql\" (UID: \"00660609-49d3-4cf5-a722-35739a5493e8\") " pod="openshift-marketplace/redhat-marketplace-mwcql" Feb 18 06:10:00 crc kubenswrapper[4869]: I0218 06:10:00.637279 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwcc4\" (UniqueName: \"kubernetes.io/projected/00660609-49d3-4cf5-a722-35739a5493e8-kube-api-access-cwcc4\") pod \"redhat-marketplace-mwcql\" (UID: \"00660609-49d3-4cf5-a722-35739a5493e8\") " pod="openshift-marketplace/redhat-marketplace-mwcql" Feb 18 06:10:00 crc kubenswrapper[4869]: I0218 06:10:00.738893 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/00660609-49d3-4cf5-a722-35739a5493e8-utilities\") pod \"redhat-marketplace-mwcql\" (UID: \"00660609-49d3-4cf5-a722-35739a5493e8\") " pod="openshift-marketplace/redhat-marketplace-mwcql" Feb 18 06:10:00 crc kubenswrapper[4869]: I0218 06:10:00.738997 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00660609-49d3-4cf5-a722-35739a5493e8-catalog-content\") pod \"redhat-marketplace-mwcql\" (UID: \"00660609-49d3-4cf5-a722-35739a5493e8\") " pod="openshift-marketplace/redhat-marketplace-mwcql" Feb 18 06:10:00 crc kubenswrapper[4869]: I0218 06:10:00.739081 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwcc4\" (UniqueName: \"kubernetes.io/projected/00660609-49d3-4cf5-a722-35739a5493e8-kube-api-access-cwcc4\") pod \"redhat-marketplace-mwcql\" (UID: \"00660609-49d3-4cf5-a722-35739a5493e8\") " pod="openshift-marketplace/redhat-marketplace-mwcql" Feb 18 06:10:00 crc kubenswrapper[4869]: I0218 06:10:00.740249 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00660609-49d3-4cf5-a722-35739a5493e8-catalog-content\") pod \"redhat-marketplace-mwcql\" (UID: \"00660609-49d3-4cf5-a722-35739a5493e8\") " pod="openshift-marketplace/redhat-marketplace-mwcql" Feb 18 06:10:00 crc kubenswrapper[4869]: I0218 06:10:00.741006 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00660609-49d3-4cf5-a722-35739a5493e8-utilities\") pod \"redhat-marketplace-mwcql\" (UID: \"00660609-49d3-4cf5-a722-35739a5493e8\") " pod="openshift-marketplace/redhat-marketplace-mwcql" Feb 18 06:10:00 crc kubenswrapper[4869]: I0218 06:10:00.764399 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwcc4\" (UniqueName: 
\"kubernetes.io/projected/00660609-49d3-4cf5-a722-35739a5493e8-kube-api-access-cwcc4\") pod \"redhat-marketplace-mwcql\" (UID: \"00660609-49d3-4cf5-a722-35739a5493e8\") " pod="openshift-marketplace/redhat-marketplace-mwcql" Feb 18 06:10:00 crc kubenswrapper[4869]: I0218 06:10:00.916238 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mwcql" Feb 18 06:10:01 crc kubenswrapper[4869]: I0218 06:10:01.393828 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mwcql"] Feb 18 06:10:02 crc kubenswrapper[4869]: I0218 06:10:02.108826 4869 generic.go:334] "Generic (PLEG): container finished" podID="00660609-49d3-4cf5-a722-35739a5493e8" containerID="ec60f3acb7db01e023f3bfd190ce0162513aebdec316b5ccb0961afb0810dc33" exitCode=0 Feb 18 06:10:02 crc kubenswrapper[4869]: I0218 06:10:02.108951 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mwcql" event={"ID":"00660609-49d3-4cf5-a722-35739a5493e8","Type":"ContainerDied","Data":"ec60f3acb7db01e023f3bfd190ce0162513aebdec316b5ccb0961afb0810dc33"} Feb 18 06:10:02 crc kubenswrapper[4869]: I0218 06:10:02.109511 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mwcql" event={"ID":"00660609-49d3-4cf5-a722-35739a5493e8","Type":"ContainerStarted","Data":"a72b89ef27671507b89d18cfe316fcb3a5375a0db011860a6547f8459303f512"} Feb 18 06:10:04 crc kubenswrapper[4869]: I0218 06:10:04.135226 4869 generic.go:334] "Generic (PLEG): container finished" podID="00660609-49d3-4cf5-a722-35739a5493e8" containerID="d17098b90e4e92cf8a57d1727e6c0002aa0c3804afb65195e66d9272dcac1d9e" exitCode=0 Feb 18 06:10:04 crc kubenswrapper[4869]: I0218 06:10:04.136044 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mwcql" 
event={"ID":"00660609-49d3-4cf5-a722-35739a5493e8","Type":"ContainerDied","Data":"d17098b90e4e92cf8a57d1727e6c0002aa0c3804afb65195e66d9272dcac1d9e"} Feb 18 06:10:05 crc kubenswrapper[4869]: I0218 06:10:05.152996 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mwcql" event={"ID":"00660609-49d3-4cf5-a722-35739a5493e8","Type":"ContainerStarted","Data":"9e2cb898aed11b2f3dcb2fb13dfaedbbbb9b43349ff29c04e16049bfc5d1e5c3"} Feb 18 06:10:05 crc kubenswrapper[4869]: I0218 06:10:05.173900 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mwcql" podStartSLOduration=2.73150253 podStartE2EDuration="5.173883513s" podCreationTimestamp="2026-02-18 06:10:00 +0000 UTC" firstStartedPulling="2026-02-18 06:10:02.110953208 +0000 UTC m=+1299.280041440" lastFinishedPulling="2026-02-18 06:10:04.553334191 +0000 UTC m=+1301.722422423" observedRunningTime="2026-02-18 06:10:05.171268161 +0000 UTC m=+1302.340356403" watchObservedRunningTime="2026-02-18 06:10:05.173883513 +0000 UTC m=+1302.342971755" Feb 18 06:10:05 crc kubenswrapper[4869]: I0218 06:10:05.579939 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bbqv7"] Feb 18 06:10:05 crc kubenswrapper[4869]: I0218 06:10:05.582657 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bbqv7" Feb 18 06:10:05 crc kubenswrapper[4869]: I0218 06:10:05.606721 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bbqv7"] Feb 18 06:10:05 crc kubenswrapper[4869]: I0218 06:10:05.643697 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6rkz\" (UniqueName: \"kubernetes.io/projected/e3e1e29c-fbd6-41f5-870f-b690738aaa0b-kube-api-access-j6rkz\") pod \"redhat-operators-bbqv7\" (UID: \"e3e1e29c-fbd6-41f5-870f-b690738aaa0b\") " pod="openshift-marketplace/redhat-operators-bbqv7" Feb 18 06:10:05 crc kubenswrapper[4869]: I0218 06:10:05.643795 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3e1e29c-fbd6-41f5-870f-b690738aaa0b-catalog-content\") pod \"redhat-operators-bbqv7\" (UID: \"e3e1e29c-fbd6-41f5-870f-b690738aaa0b\") " pod="openshift-marketplace/redhat-operators-bbqv7" Feb 18 06:10:05 crc kubenswrapper[4869]: I0218 06:10:05.643871 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3e1e29c-fbd6-41f5-870f-b690738aaa0b-utilities\") pod \"redhat-operators-bbqv7\" (UID: \"e3e1e29c-fbd6-41f5-870f-b690738aaa0b\") " pod="openshift-marketplace/redhat-operators-bbqv7" Feb 18 06:10:05 crc kubenswrapper[4869]: I0218 06:10:05.745022 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3e1e29c-fbd6-41f5-870f-b690738aaa0b-utilities\") pod \"redhat-operators-bbqv7\" (UID: \"e3e1e29c-fbd6-41f5-870f-b690738aaa0b\") " pod="openshift-marketplace/redhat-operators-bbqv7" Feb 18 06:10:05 crc kubenswrapper[4869]: I0218 06:10:05.745135 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-j6rkz\" (UniqueName: \"kubernetes.io/projected/e3e1e29c-fbd6-41f5-870f-b690738aaa0b-kube-api-access-j6rkz\") pod \"redhat-operators-bbqv7\" (UID: \"e3e1e29c-fbd6-41f5-870f-b690738aaa0b\") " pod="openshift-marketplace/redhat-operators-bbqv7" Feb 18 06:10:05 crc kubenswrapper[4869]: I0218 06:10:05.745190 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3e1e29c-fbd6-41f5-870f-b690738aaa0b-catalog-content\") pod \"redhat-operators-bbqv7\" (UID: \"e3e1e29c-fbd6-41f5-870f-b690738aaa0b\") " pod="openshift-marketplace/redhat-operators-bbqv7" Feb 18 06:10:05 crc kubenswrapper[4869]: I0218 06:10:05.745608 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3e1e29c-fbd6-41f5-870f-b690738aaa0b-catalog-content\") pod \"redhat-operators-bbqv7\" (UID: \"e3e1e29c-fbd6-41f5-870f-b690738aaa0b\") " pod="openshift-marketplace/redhat-operators-bbqv7" Feb 18 06:10:05 crc kubenswrapper[4869]: I0218 06:10:05.745860 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3e1e29c-fbd6-41f5-870f-b690738aaa0b-utilities\") pod \"redhat-operators-bbqv7\" (UID: \"e3e1e29c-fbd6-41f5-870f-b690738aaa0b\") " pod="openshift-marketplace/redhat-operators-bbqv7" Feb 18 06:10:05 crc kubenswrapper[4869]: I0218 06:10:05.768548 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6rkz\" (UniqueName: \"kubernetes.io/projected/e3e1e29c-fbd6-41f5-870f-b690738aaa0b-kube-api-access-j6rkz\") pod \"redhat-operators-bbqv7\" (UID: \"e3e1e29c-fbd6-41f5-870f-b690738aaa0b\") " pod="openshift-marketplace/redhat-operators-bbqv7" Feb 18 06:10:05 crc kubenswrapper[4869]: I0218 06:10:05.905652 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bbqv7" Feb 18 06:10:06 crc kubenswrapper[4869]: I0218 06:10:06.012936 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 18 06:10:06 crc kubenswrapper[4869]: I0218 06:10:06.515305 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bbqv7"] Feb 18 06:10:07 crc kubenswrapper[4869]: I0218 06:10:07.186501 4869 generic.go:334] "Generic (PLEG): container finished" podID="e3e1e29c-fbd6-41f5-870f-b690738aaa0b" containerID="1ec120c6c6fe45d88d8c787b416e549684ddbff42df34ffeec3a1dfe99cffe81" exitCode=0 Feb 18 06:10:07 crc kubenswrapper[4869]: I0218 06:10:07.186602 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbqv7" event={"ID":"e3e1e29c-fbd6-41f5-870f-b690738aaa0b","Type":"ContainerDied","Data":"1ec120c6c6fe45d88d8c787b416e549684ddbff42df34ffeec3a1dfe99cffe81"} Feb 18 06:10:07 crc kubenswrapper[4869]: I0218 06:10:07.186806 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbqv7" event={"ID":"e3e1e29c-fbd6-41f5-870f-b690738aaa0b","Type":"ContainerStarted","Data":"3137c07753dad0d89c7a63f3765c449fd09d4ed8a40d7998a6431078f35b1c1d"} Feb 18 06:10:08 crc kubenswrapper[4869]: I0218 06:10:08.199492 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbqv7" event={"ID":"e3e1e29c-fbd6-41f5-870f-b690738aaa0b","Type":"ContainerStarted","Data":"47f3f4c796d8e7ec4ef4c41542d02274d9662c519010b236ce2db1ee3e262fcb"} Feb 18 06:10:10 crc kubenswrapper[4869]: I0218 06:10:10.132411 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 
06:10:10 crc kubenswrapper[4869]: I0218 06:10:10.133014 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:10:10 crc kubenswrapper[4869]: I0218 06:10:10.222214 4869 generic.go:334] "Generic (PLEG): container finished" podID="a49b5bd1-87a9-4536-b8fb-5f32f8024b8a" containerID="3628f32ecca77a513f6cc3cbbfd2f9abf6ed05d8cd15cd23257b6731017b12be" exitCode=0 Feb 18 06:10:10 crc kubenswrapper[4869]: I0218 06:10:10.222302 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8" event={"ID":"a49b5bd1-87a9-4536-b8fb-5f32f8024b8a","Type":"ContainerDied","Data":"3628f32ecca77a513f6cc3cbbfd2f9abf6ed05d8cd15cd23257b6731017b12be"} Feb 18 06:10:10 crc kubenswrapper[4869]: I0218 06:10:10.228004 4869 generic.go:334] "Generic (PLEG): container finished" podID="e3e1e29c-fbd6-41f5-870f-b690738aaa0b" containerID="47f3f4c796d8e7ec4ef4c41542d02274d9662c519010b236ce2db1ee3e262fcb" exitCode=0 Feb 18 06:10:10 crc kubenswrapper[4869]: I0218 06:10:10.228055 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbqv7" event={"ID":"e3e1e29c-fbd6-41f5-870f-b690738aaa0b","Type":"ContainerDied","Data":"47f3f4c796d8e7ec4ef4c41542d02274d9662c519010b236ce2db1ee3e262fcb"} Feb 18 06:10:10 crc kubenswrapper[4869]: I0218 06:10:10.917083 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mwcql" Feb 18 06:10:10 crc kubenswrapper[4869]: I0218 06:10:10.918468 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mwcql" Feb 18 06:10:11 crc kubenswrapper[4869]: I0218 06:10:11.974712 4869 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-mwcql" podUID="00660609-49d3-4cf5-a722-35739a5493e8" containerName="registry-server" probeResult="failure" output=< Feb 18 06:10:11 crc kubenswrapper[4869]: timeout: failed to connect service ":50051" within 1s Feb 18 06:10:11 crc kubenswrapper[4869]: > Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.008591 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8" Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.194648 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a49b5bd1-87a9-4536-b8fb-5f32f8024b8a-repo-setup-combined-ca-bundle\") pod \"a49b5bd1-87a9-4536-b8fb-5f32f8024b8a\" (UID: \"a49b5bd1-87a9-4536-b8fb-5f32f8024b8a\") " Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.194802 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4dft\" (UniqueName: \"kubernetes.io/projected/a49b5bd1-87a9-4536-b8fb-5f32f8024b8a-kube-api-access-c4dft\") pod \"a49b5bd1-87a9-4536-b8fb-5f32f8024b8a\" (UID: \"a49b5bd1-87a9-4536-b8fb-5f32f8024b8a\") " Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.194868 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a49b5bd1-87a9-4536-b8fb-5f32f8024b8a-ssh-key-openstack-edpm-ipam\") pod \"a49b5bd1-87a9-4536-b8fb-5f32f8024b8a\" (UID: \"a49b5bd1-87a9-4536-b8fb-5f32f8024b8a\") " Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.195042 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a49b5bd1-87a9-4536-b8fb-5f32f8024b8a-inventory\") pod \"a49b5bd1-87a9-4536-b8fb-5f32f8024b8a\" (UID: 
\"a49b5bd1-87a9-4536-b8fb-5f32f8024b8a\") " Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.200190 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a49b5bd1-87a9-4536-b8fb-5f32f8024b8a-kube-api-access-c4dft" (OuterVolumeSpecName: "kube-api-access-c4dft") pod "a49b5bd1-87a9-4536-b8fb-5f32f8024b8a" (UID: "a49b5bd1-87a9-4536-b8fb-5f32f8024b8a"). InnerVolumeSpecName "kube-api-access-c4dft". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.207182 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a49b5bd1-87a9-4536-b8fb-5f32f8024b8a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "a49b5bd1-87a9-4536-b8fb-5f32f8024b8a" (UID: "a49b5bd1-87a9-4536-b8fb-5f32f8024b8a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.227188 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a49b5bd1-87a9-4536-b8fb-5f32f8024b8a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a49b5bd1-87a9-4536-b8fb-5f32f8024b8a" (UID: "a49b5bd1-87a9-4536-b8fb-5f32f8024b8a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.232005 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a49b5bd1-87a9-4536-b8fb-5f32f8024b8a-inventory" (OuterVolumeSpecName: "inventory") pod "a49b5bd1-87a9-4536-b8fb-5f32f8024b8a" (UID: "a49b5bd1-87a9-4536-b8fb-5f32f8024b8a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.297875 4869 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a49b5bd1-87a9-4536-b8fb-5f32f8024b8a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.297909 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4dft\" (UniqueName: \"kubernetes.io/projected/a49b5bd1-87a9-4536-b8fb-5f32f8024b8a-kube-api-access-c4dft\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.297920 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a49b5bd1-87a9-4536-b8fb-5f32f8024b8a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.297929 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a49b5bd1-87a9-4536-b8fb-5f32f8024b8a-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.335829 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-h9skj"] Feb 18 06:10:12 crc kubenswrapper[4869]: E0218 06:10:12.336208 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a49b5bd1-87a9-4536-b8fb-5f32f8024b8a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.336229 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a49b5bd1-87a9-4536-b8fb-5f32f8024b8a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.336409 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="a49b5bd1-87a9-4536-b8fb-5f32f8024b8a" 
containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.337103 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h9skj" Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.353235 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-h9skj"] Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.399864 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93c9d860-2ea7-4a81-b383-aae67501c7f8-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-h9skj\" (UID: \"93c9d860-2ea7-4a81-b383-aae67501c7f8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h9skj" Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.399959 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93c9d860-2ea7-4a81-b383-aae67501c7f8-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-h9skj\" (UID: \"93c9d860-2ea7-4a81-b383-aae67501c7f8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h9skj" Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.400101 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkj5t\" (UniqueName: \"kubernetes.io/projected/93c9d860-2ea7-4a81-b383-aae67501c7f8-kube-api-access-mkj5t\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-h9skj\" (UID: \"93c9d860-2ea7-4a81-b383-aae67501c7f8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h9skj" Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.501483 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkj5t\" 
(UniqueName: \"kubernetes.io/projected/93c9d860-2ea7-4a81-b383-aae67501c7f8-kube-api-access-mkj5t\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-h9skj\" (UID: \"93c9d860-2ea7-4a81-b383-aae67501c7f8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h9skj" Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.502363 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93c9d860-2ea7-4a81-b383-aae67501c7f8-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-h9skj\" (UID: \"93c9d860-2ea7-4a81-b383-aae67501c7f8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h9skj" Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.502569 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93c9d860-2ea7-4a81-b383-aae67501c7f8-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-h9skj\" (UID: \"93c9d860-2ea7-4a81-b383-aae67501c7f8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h9skj" Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.508100 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93c9d860-2ea7-4a81-b383-aae67501c7f8-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-h9skj\" (UID: \"93c9d860-2ea7-4a81-b383-aae67501c7f8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h9skj" Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.509055 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93c9d860-2ea7-4a81-b383-aae67501c7f8-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-h9skj\" (UID: \"93c9d860-2ea7-4a81-b383-aae67501c7f8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h9skj" Feb 18 
06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.520301 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkj5t\" (UniqueName: \"kubernetes.io/projected/93c9d860-2ea7-4a81-b383-aae67501c7f8-kube-api-access-mkj5t\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-h9skj\" (UID: \"93c9d860-2ea7-4a81-b383-aae67501c7f8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h9skj" Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.614549 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbqv7" event={"ID":"e3e1e29c-fbd6-41f5-870f-b690738aaa0b","Type":"ContainerStarted","Data":"b9610420f90ca598af650b338beee899b76d76b8966ce99f0ea75c25b6c9c1ad"} Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.616801 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8" event={"ID":"a49b5bd1-87a9-4536-b8fb-5f32f8024b8a","Type":"ContainerDied","Data":"26b3173b4a8ab27200d3d43b74e6f7aee6ff10ec4879de371050f7789e64fa6b"} Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.616823 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26b3173b4a8ab27200d3d43b74e6f7aee6ff10ec4879de371050f7789e64fa6b" Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.616854 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8" Feb 18 06:10:12 crc kubenswrapper[4869]: I0218 06:10:12.651710 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h9skj" Feb 18 06:10:13 crc kubenswrapper[4869]: I0218 06:10:13.726894 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bbqv7" podStartSLOduration=5.060905212 podStartE2EDuration="8.726877542s" podCreationTimestamp="2026-02-18 06:10:05 +0000 UTC" firstStartedPulling="2026-02-18 06:10:07.188739179 +0000 UTC m=+1304.357827411" lastFinishedPulling="2026-02-18 06:10:10.854711509 +0000 UTC m=+1308.023799741" observedRunningTime="2026-02-18 06:10:12.634907131 +0000 UTC m=+1309.803995383" watchObservedRunningTime="2026-02-18 06:10:13.726877542 +0000 UTC m=+1310.895965774" Feb 18 06:10:13 crc kubenswrapper[4869]: I0218 06:10:13.730313 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-h9skj"] Feb 18 06:10:13 crc kubenswrapper[4869]: W0218 06:10:13.736717 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93c9d860_2ea7_4a81_b383_aae67501c7f8.slice/crio-60a41ba1d0105bec6d07cb3a90e2cd4bbf416f1eb0cf6372c2d8ef458b1d008b WatchSource:0}: Error finding container 60a41ba1d0105bec6d07cb3a90e2cd4bbf416f1eb0cf6372c2d8ef458b1d008b: Status 404 returned error can't find the container with id 60a41ba1d0105bec6d07cb3a90e2cd4bbf416f1eb0cf6372c2d8ef458b1d008b Feb 18 06:10:14 crc kubenswrapper[4869]: I0218 06:10:14.635032 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h9skj" event={"ID":"93c9d860-2ea7-4a81-b383-aae67501c7f8","Type":"ContainerStarted","Data":"883fc6d6ba2b1c2668b5a7af085c6bbdd4168ecb8791c3e4b3101f82174b4ff8"} Feb 18 06:10:14 crc kubenswrapper[4869]: I0218 06:10:14.635351 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h9skj" 
event={"ID":"93c9d860-2ea7-4a81-b383-aae67501c7f8","Type":"ContainerStarted","Data":"60a41ba1d0105bec6d07cb3a90e2cd4bbf416f1eb0cf6372c2d8ef458b1d008b"} Feb 18 06:10:14 crc kubenswrapper[4869]: I0218 06:10:14.655670 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h9skj" podStartSLOduration=2.232242138 podStartE2EDuration="2.655653526s" podCreationTimestamp="2026-02-18 06:10:12 +0000 UTC" firstStartedPulling="2026-02-18 06:10:13.739974103 +0000 UTC m=+1310.909062335" lastFinishedPulling="2026-02-18 06:10:14.163385491 +0000 UTC m=+1311.332473723" observedRunningTime="2026-02-18 06:10:14.650637847 +0000 UTC m=+1311.819726079" watchObservedRunningTime="2026-02-18 06:10:14.655653526 +0000 UTC m=+1311.824741758" Feb 18 06:10:15 crc kubenswrapper[4869]: I0218 06:10:15.906662 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bbqv7" Feb 18 06:10:15 crc kubenswrapper[4869]: I0218 06:10:15.907018 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bbqv7" Feb 18 06:10:16 crc kubenswrapper[4869]: I0218 06:10:16.957319 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bbqv7" podUID="e3e1e29c-fbd6-41f5-870f-b690738aaa0b" containerName="registry-server" probeResult="failure" output=< Feb 18 06:10:16 crc kubenswrapper[4869]: timeout: failed to connect service ":50051" within 1s Feb 18 06:10:16 crc kubenswrapper[4869]: > Feb 18 06:10:16 crc kubenswrapper[4869]: I0218 06:10:16.996952 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:17 crc kubenswrapper[4869]: I0218 06:10:17.665328 4869 generic.go:334] "Generic (PLEG): container finished" podID="93c9d860-2ea7-4a81-b383-aae67501c7f8" 
containerID="883fc6d6ba2b1c2668b5a7af085c6bbdd4168ecb8791c3e4b3101f82174b4ff8" exitCode=0 Feb 18 06:10:17 crc kubenswrapper[4869]: I0218 06:10:17.665414 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h9skj" event={"ID":"93c9d860-2ea7-4a81-b383-aae67501c7f8","Type":"ContainerDied","Data":"883fc6d6ba2b1c2668b5a7af085c6bbdd4168ecb8791c3e4b3101f82174b4ff8"} Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.136490 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h9skj" Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.225610 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93c9d860-2ea7-4a81-b383-aae67501c7f8-inventory\") pod \"93c9d860-2ea7-4a81-b383-aae67501c7f8\" (UID: \"93c9d860-2ea7-4a81-b383-aae67501c7f8\") " Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.225715 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkj5t\" (UniqueName: \"kubernetes.io/projected/93c9d860-2ea7-4a81-b383-aae67501c7f8-kube-api-access-mkj5t\") pod \"93c9d860-2ea7-4a81-b383-aae67501c7f8\" (UID: \"93c9d860-2ea7-4a81-b383-aae67501c7f8\") " Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.225780 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93c9d860-2ea7-4a81-b383-aae67501c7f8-ssh-key-openstack-edpm-ipam\") pod \"93c9d860-2ea7-4a81-b383-aae67501c7f8\" (UID: \"93c9d860-2ea7-4a81-b383-aae67501c7f8\") " Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.237346 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93c9d860-2ea7-4a81-b383-aae67501c7f8-kube-api-access-mkj5t" (OuterVolumeSpecName: "kube-api-access-mkj5t") pod 
"93c9d860-2ea7-4a81-b383-aae67501c7f8" (UID: "93c9d860-2ea7-4a81-b383-aae67501c7f8"). InnerVolumeSpecName "kube-api-access-mkj5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.259705 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93c9d860-2ea7-4a81-b383-aae67501c7f8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "93c9d860-2ea7-4a81-b383-aae67501c7f8" (UID: "93c9d860-2ea7-4a81-b383-aae67501c7f8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.263192 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93c9d860-2ea7-4a81-b383-aae67501c7f8-inventory" (OuterVolumeSpecName: "inventory") pod "93c9d860-2ea7-4a81-b383-aae67501c7f8" (UID: "93c9d860-2ea7-4a81-b383-aae67501c7f8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.327673 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93c9d860-2ea7-4a81-b383-aae67501c7f8-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.327720 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkj5t\" (UniqueName: \"kubernetes.io/projected/93c9d860-2ea7-4a81-b383-aae67501c7f8-kube-api-access-mkj5t\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.327734 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93c9d860-2ea7-4a81-b383-aae67501c7f8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.686978 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h9skj" event={"ID":"93c9d860-2ea7-4a81-b383-aae67501c7f8","Type":"ContainerDied","Data":"60a41ba1d0105bec6d07cb3a90e2cd4bbf416f1eb0cf6372c2d8ef458b1d008b"} Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.687231 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60a41ba1d0105bec6d07cb3a90e2cd4bbf416f1eb0cf6372c2d8ef458b1d008b" Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.687436 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h9skj" Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.759432 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw"] Feb 18 06:10:19 crc kubenswrapper[4869]: E0218 06:10:19.759951 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93c9d860-2ea7-4a81-b383-aae67501c7f8" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.759978 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="93c9d860-2ea7-4a81-b383-aae67501c7f8" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.760252 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="93c9d860-2ea7-4a81-b383-aae67501c7f8" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.761003 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw" Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.762823 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.762849 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5vjl5" Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.762973 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.765494 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.772568 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw"] Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.836462 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/26716094-10bf-4523-9c23-674dd4b7d517-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw\" (UID: \"26716094-10bf-4523-9c23-674dd4b7d517\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw" Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.836537 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26716094-10bf-4523-9c23-674dd4b7d517-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw\" (UID: \"26716094-10bf-4523-9c23-674dd4b7d517\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw" Feb 18 06:10:19 crc kubenswrapper[4869]: 
I0218 06:10:19.836672 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7zd7\" (UniqueName: \"kubernetes.io/projected/26716094-10bf-4523-9c23-674dd4b7d517-kube-api-access-s7zd7\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw\" (UID: \"26716094-10bf-4523-9c23-674dd4b7d517\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw" Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.837026 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26716094-10bf-4523-9c23-674dd4b7d517-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw\" (UID: \"26716094-10bf-4523-9c23-674dd4b7d517\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw" Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.938327 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/26716094-10bf-4523-9c23-674dd4b7d517-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw\" (UID: \"26716094-10bf-4523-9c23-674dd4b7d517\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw" Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.938623 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26716094-10bf-4523-9c23-674dd4b7d517-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw\" (UID: \"26716094-10bf-4523-9c23-674dd4b7d517\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw" Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.938774 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7zd7\" (UniqueName: 
\"kubernetes.io/projected/26716094-10bf-4523-9c23-674dd4b7d517-kube-api-access-s7zd7\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw\" (UID: \"26716094-10bf-4523-9c23-674dd4b7d517\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw" Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.938984 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26716094-10bf-4523-9c23-674dd4b7d517-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw\" (UID: \"26716094-10bf-4523-9c23-674dd4b7d517\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw" Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.943004 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/26716094-10bf-4523-9c23-674dd4b7d517-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw\" (UID: \"26716094-10bf-4523-9c23-674dd4b7d517\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw" Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.944358 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26716094-10bf-4523-9c23-674dd4b7d517-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw\" (UID: \"26716094-10bf-4523-9c23-674dd4b7d517\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw" Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.948399 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26716094-10bf-4523-9c23-674dd4b7d517-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw\" (UID: \"26716094-10bf-4523-9c23-674dd4b7d517\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw" Feb 18 06:10:19 crc kubenswrapper[4869]: I0218 06:10:19.955061 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7zd7\" (UniqueName: \"kubernetes.io/projected/26716094-10bf-4523-9c23-674dd4b7d517-kube-api-access-s7zd7\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw\" (UID: \"26716094-10bf-4523-9c23-674dd4b7d517\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw" Feb 18 06:10:20 crc kubenswrapper[4869]: I0218 06:10:20.082093 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw" Feb 18 06:10:20 crc kubenswrapper[4869]: W0218 06:10:20.517852 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26716094_10bf_4523_9c23_674dd4b7d517.slice/crio-3f3d008d40e8385ad233675d7a2c26d3eb1fba84083e8734c3455408341cb45d WatchSource:0}: Error finding container 3f3d008d40e8385ad233675d7a2c26d3eb1fba84083e8734c3455408341cb45d: Status 404 returned error can't find the container with id 3f3d008d40e8385ad233675d7a2c26d3eb1fba84083e8734c3455408341cb45d Feb 18 06:10:20 crc kubenswrapper[4869]: I0218 06:10:20.524819 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw"] Feb 18 06:10:20 crc kubenswrapper[4869]: I0218 06:10:20.697191 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw" event={"ID":"26716094-10bf-4523-9c23-674dd4b7d517","Type":"ContainerStarted","Data":"3f3d008d40e8385ad233675d7a2c26d3eb1fba84083e8734c3455408341cb45d"} Feb 18 06:10:20 crc kubenswrapper[4869]: I0218 06:10:20.978539 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mwcql" Feb 18 06:10:21 crc 
kubenswrapper[4869]: I0218 06:10:21.027707 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mwcql" Feb 18 06:10:21 crc kubenswrapper[4869]: I0218 06:10:21.217129 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mwcql"] Feb 18 06:10:21 crc kubenswrapper[4869]: I0218 06:10:21.710186 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw" event={"ID":"26716094-10bf-4523-9c23-674dd4b7d517","Type":"ContainerStarted","Data":"c1307627db66327ffb9c8f6caed63da3b49f9b44d2ce0c1bb5c60f839e8bb1a8"} Feb 18 06:10:21 crc kubenswrapper[4869]: I0218 06:10:21.742924 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw" podStartSLOduration=2.293930848 podStartE2EDuration="2.742895964s" podCreationTimestamp="2026-02-18 06:10:19 +0000 UTC" firstStartedPulling="2026-02-18 06:10:20.52117392 +0000 UTC m=+1317.690262162" lastFinishedPulling="2026-02-18 06:10:20.970139036 +0000 UTC m=+1318.139227278" observedRunningTime="2026-02-18 06:10:21.733810668 +0000 UTC m=+1318.902898930" watchObservedRunningTime="2026-02-18 06:10:21.742895964 +0000 UTC m=+1318.911984216" Feb 18 06:10:22 crc kubenswrapper[4869]: I0218 06:10:22.718653 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mwcql" podUID="00660609-49d3-4cf5-a722-35739a5493e8" containerName="registry-server" containerID="cri-o://9e2cb898aed11b2f3dcb2fb13dfaedbbbb9b43349ff29c04e16049bfc5d1e5c3" gracePeriod=2 Feb 18 06:10:23 crc kubenswrapper[4869]: I0218 06:10:23.156119 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mwcql" Feb 18 06:10:23 crc kubenswrapper[4869]: I0218 06:10:23.318793 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00660609-49d3-4cf5-a722-35739a5493e8-utilities\") pod \"00660609-49d3-4cf5-a722-35739a5493e8\" (UID: \"00660609-49d3-4cf5-a722-35739a5493e8\") " Feb 18 06:10:23 crc kubenswrapper[4869]: I0218 06:10:23.318988 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwcc4\" (UniqueName: \"kubernetes.io/projected/00660609-49d3-4cf5-a722-35739a5493e8-kube-api-access-cwcc4\") pod \"00660609-49d3-4cf5-a722-35739a5493e8\" (UID: \"00660609-49d3-4cf5-a722-35739a5493e8\") " Feb 18 06:10:23 crc kubenswrapper[4869]: I0218 06:10:23.319030 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00660609-49d3-4cf5-a722-35739a5493e8-catalog-content\") pod \"00660609-49d3-4cf5-a722-35739a5493e8\" (UID: \"00660609-49d3-4cf5-a722-35739a5493e8\") " Feb 18 06:10:23 crc kubenswrapper[4869]: I0218 06:10:23.319724 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00660609-49d3-4cf5-a722-35739a5493e8-utilities" (OuterVolumeSpecName: "utilities") pod "00660609-49d3-4cf5-a722-35739a5493e8" (UID: "00660609-49d3-4cf5-a722-35739a5493e8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:10:23 crc kubenswrapper[4869]: I0218 06:10:23.328324 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00660609-49d3-4cf5-a722-35739a5493e8-kube-api-access-cwcc4" (OuterVolumeSpecName: "kube-api-access-cwcc4") pod "00660609-49d3-4cf5-a722-35739a5493e8" (UID: "00660609-49d3-4cf5-a722-35739a5493e8"). InnerVolumeSpecName "kube-api-access-cwcc4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:10:23 crc kubenswrapper[4869]: I0218 06:10:23.350922 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00660609-49d3-4cf5-a722-35739a5493e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "00660609-49d3-4cf5-a722-35739a5493e8" (UID: "00660609-49d3-4cf5-a722-35739a5493e8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:10:23 crc kubenswrapper[4869]: I0218 06:10:23.421705 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00660609-49d3-4cf5-a722-35739a5493e8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:23 crc kubenswrapper[4869]: I0218 06:10:23.421773 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00660609-49d3-4cf5-a722-35739a5493e8-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:23 crc kubenswrapper[4869]: I0218 06:10:23.421790 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwcc4\" (UniqueName: \"kubernetes.io/projected/00660609-49d3-4cf5-a722-35739a5493e8-kube-api-access-cwcc4\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:23 crc kubenswrapper[4869]: I0218 06:10:23.734265 4869 generic.go:334] "Generic (PLEG): container finished" podID="00660609-49d3-4cf5-a722-35739a5493e8" containerID="9e2cb898aed11b2f3dcb2fb13dfaedbbbb9b43349ff29c04e16049bfc5d1e5c3" exitCode=0 Feb 18 06:10:23 crc kubenswrapper[4869]: I0218 06:10:23.734921 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mwcql" event={"ID":"00660609-49d3-4cf5-a722-35739a5493e8","Type":"ContainerDied","Data":"9e2cb898aed11b2f3dcb2fb13dfaedbbbb9b43349ff29c04e16049bfc5d1e5c3"} Feb 18 06:10:23 crc kubenswrapper[4869]: I0218 06:10:23.734979 4869 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-mwcql" event={"ID":"00660609-49d3-4cf5-a722-35739a5493e8","Type":"ContainerDied","Data":"a72b89ef27671507b89d18cfe316fcb3a5375a0db011860a6547f8459303f512"} Feb 18 06:10:23 crc kubenswrapper[4869]: I0218 06:10:23.735017 4869 scope.go:117] "RemoveContainer" containerID="9e2cb898aed11b2f3dcb2fb13dfaedbbbb9b43349ff29c04e16049bfc5d1e5c3" Feb 18 06:10:23 crc kubenswrapper[4869]: I0218 06:10:23.735574 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mwcql" Feb 18 06:10:23 crc kubenswrapper[4869]: I0218 06:10:23.763251 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mwcql"] Feb 18 06:10:23 crc kubenswrapper[4869]: I0218 06:10:23.767264 4869 scope.go:117] "RemoveContainer" containerID="d17098b90e4e92cf8a57d1727e6c0002aa0c3804afb65195e66d9272dcac1d9e" Feb 18 06:10:23 crc kubenswrapper[4869]: I0218 06:10:23.772947 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mwcql"] Feb 18 06:10:23 crc kubenswrapper[4869]: I0218 06:10:23.790419 4869 scope.go:117] "RemoveContainer" containerID="ec60f3acb7db01e023f3bfd190ce0162513aebdec316b5ccb0961afb0810dc33" Feb 18 06:10:23 crc kubenswrapper[4869]: I0218 06:10:23.845766 4869 scope.go:117] "RemoveContainer" containerID="9e2cb898aed11b2f3dcb2fb13dfaedbbbb9b43349ff29c04e16049bfc5d1e5c3" Feb 18 06:10:23 crc kubenswrapper[4869]: E0218 06:10:23.846411 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e2cb898aed11b2f3dcb2fb13dfaedbbbb9b43349ff29c04e16049bfc5d1e5c3\": container with ID starting with 9e2cb898aed11b2f3dcb2fb13dfaedbbbb9b43349ff29c04e16049bfc5d1e5c3 not found: ID does not exist" containerID="9e2cb898aed11b2f3dcb2fb13dfaedbbbb9b43349ff29c04e16049bfc5d1e5c3" Feb 18 06:10:23 crc kubenswrapper[4869]: I0218 06:10:23.846464 4869 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e2cb898aed11b2f3dcb2fb13dfaedbbbb9b43349ff29c04e16049bfc5d1e5c3"} err="failed to get container status \"9e2cb898aed11b2f3dcb2fb13dfaedbbbb9b43349ff29c04e16049bfc5d1e5c3\": rpc error: code = NotFound desc = could not find container \"9e2cb898aed11b2f3dcb2fb13dfaedbbbb9b43349ff29c04e16049bfc5d1e5c3\": container with ID starting with 9e2cb898aed11b2f3dcb2fb13dfaedbbbb9b43349ff29c04e16049bfc5d1e5c3 not found: ID does not exist" Feb 18 06:10:23 crc kubenswrapper[4869]: I0218 06:10:23.846497 4869 scope.go:117] "RemoveContainer" containerID="d17098b90e4e92cf8a57d1727e6c0002aa0c3804afb65195e66d9272dcac1d9e" Feb 18 06:10:23 crc kubenswrapper[4869]: E0218 06:10:23.846887 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d17098b90e4e92cf8a57d1727e6c0002aa0c3804afb65195e66d9272dcac1d9e\": container with ID starting with d17098b90e4e92cf8a57d1727e6c0002aa0c3804afb65195e66d9272dcac1d9e not found: ID does not exist" containerID="d17098b90e4e92cf8a57d1727e6c0002aa0c3804afb65195e66d9272dcac1d9e" Feb 18 06:10:23 crc kubenswrapper[4869]: I0218 06:10:23.846920 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d17098b90e4e92cf8a57d1727e6c0002aa0c3804afb65195e66d9272dcac1d9e"} err="failed to get container status \"d17098b90e4e92cf8a57d1727e6c0002aa0c3804afb65195e66d9272dcac1d9e\": rpc error: code = NotFound desc = could not find container \"d17098b90e4e92cf8a57d1727e6c0002aa0c3804afb65195e66d9272dcac1d9e\": container with ID starting with d17098b90e4e92cf8a57d1727e6c0002aa0c3804afb65195e66d9272dcac1d9e not found: ID does not exist" Feb 18 06:10:23 crc kubenswrapper[4869]: I0218 06:10:23.846937 4869 scope.go:117] "RemoveContainer" containerID="ec60f3acb7db01e023f3bfd190ce0162513aebdec316b5ccb0961afb0810dc33" Feb 18 06:10:23 crc kubenswrapper[4869]: E0218 
06:10:23.847335 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec60f3acb7db01e023f3bfd190ce0162513aebdec316b5ccb0961afb0810dc33\": container with ID starting with ec60f3acb7db01e023f3bfd190ce0162513aebdec316b5ccb0961afb0810dc33 not found: ID does not exist" containerID="ec60f3acb7db01e023f3bfd190ce0162513aebdec316b5ccb0961afb0810dc33" Feb 18 06:10:23 crc kubenswrapper[4869]: I0218 06:10:23.847376 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec60f3acb7db01e023f3bfd190ce0162513aebdec316b5ccb0961afb0810dc33"} err="failed to get container status \"ec60f3acb7db01e023f3bfd190ce0162513aebdec316b5ccb0961afb0810dc33\": rpc error: code = NotFound desc = could not find container \"ec60f3acb7db01e023f3bfd190ce0162513aebdec316b5ccb0961afb0810dc33\": container with ID starting with ec60f3acb7db01e023f3bfd190ce0162513aebdec316b5ccb0961afb0810dc33 not found: ID does not exist" Feb 18 06:10:25 crc kubenswrapper[4869]: I0218 06:10:25.487439 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00660609-49d3-4cf5-a722-35739a5493e8" path="/var/lib/kubelet/pods/00660609-49d3-4cf5-a722-35739a5493e8/volumes" Feb 18 06:10:25 crc kubenswrapper[4869]: I0218 06:10:25.965803 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bbqv7" Feb 18 06:10:26 crc kubenswrapper[4869]: I0218 06:10:26.011326 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bbqv7" Feb 18 06:10:26 crc kubenswrapper[4869]: I0218 06:10:26.629484 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bbqv7"] Feb 18 06:10:27 crc kubenswrapper[4869]: I0218 06:10:27.781662 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bbqv7" 
podUID="e3e1e29c-fbd6-41f5-870f-b690738aaa0b" containerName="registry-server" containerID="cri-o://b9610420f90ca598af650b338beee899b76d76b8966ce99f0ea75c25b6c9c1ad" gracePeriod=2
Feb 18 06:10:27 crc kubenswrapper[4869]: I0218 06:10:27.845012 4869 scope.go:117] "RemoveContainer" containerID="f7fd390b28ec4651f960a40c7e41e2c84ed4e8119f7af25eb768835994beee99"
Feb 18 06:10:27 crc kubenswrapper[4869]: I0218 06:10:27.990648 4869 scope.go:117] "RemoveContainer" containerID="4c41796edfd0b6b6aecb5cc67532abbdd93d18dc713953ffb4654d7a0eba0c9d"
Feb 18 06:10:28 crc kubenswrapper[4869]: I0218 06:10:28.244379 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bbqv7"
Feb 18 06:10:28 crc kubenswrapper[4869]: I0218 06:10:28.417426 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3e1e29c-fbd6-41f5-870f-b690738aaa0b-catalog-content\") pod \"e3e1e29c-fbd6-41f5-870f-b690738aaa0b\" (UID: \"e3e1e29c-fbd6-41f5-870f-b690738aaa0b\") "
Feb 18 06:10:28 crc kubenswrapper[4869]: I0218 06:10:28.417490 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3e1e29c-fbd6-41f5-870f-b690738aaa0b-utilities\") pod \"e3e1e29c-fbd6-41f5-870f-b690738aaa0b\" (UID: \"e3e1e29c-fbd6-41f5-870f-b690738aaa0b\") "
Feb 18 06:10:28 crc kubenswrapper[4869]: I0218 06:10:28.417568 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6rkz\" (UniqueName: \"kubernetes.io/projected/e3e1e29c-fbd6-41f5-870f-b690738aaa0b-kube-api-access-j6rkz\") pod \"e3e1e29c-fbd6-41f5-870f-b690738aaa0b\" (UID: \"e3e1e29c-fbd6-41f5-870f-b690738aaa0b\") "
Feb 18 06:10:28 crc kubenswrapper[4869]: I0218 06:10:28.418221 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3e1e29c-fbd6-41f5-870f-b690738aaa0b-utilities" (OuterVolumeSpecName: "utilities") pod "e3e1e29c-fbd6-41f5-870f-b690738aaa0b" (UID: "e3e1e29c-fbd6-41f5-870f-b690738aaa0b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:10:28 crc kubenswrapper[4869]: I0218 06:10:28.424189 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3e1e29c-fbd6-41f5-870f-b690738aaa0b-kube-api-access-j6rkz" (OuterVolumeSpecName: "kube-api-access-j6rkz") pod "e3e1e29c-fbd6-41f5-870f-b690738aaa0b" (UID: "e3e1e29c-fbd6-41f5-870f-b690738aaa0b"). InnerVolumeSpecName "kube-api-access-j6rkz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:10:28 crc kubenswrapper[4869]: I0218 06:10:28.521506 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3e1e29c-fbd6-41f5-870f-b690738aaa0b-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 06:10:28 crc kubenswrapper[4869]: I0218 06:10:28.521558 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6rkz\" (UniqueName: \"kubernetes.io/projected/e3e1e29c-fbd6-41f5-870f-b690738aaa0b-kube-api-access-j6rkz\") on node \"crc\" DevicePath \"\""
Feb 18 06:10:28 crc kubenswrapper[4869]: I0218 06:10:28.532066 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3e1e29c-fbd6-41f5-870f-b690738aaa0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3e1e29c-fbd6-41f5-870f-b690738aaa0b" (UID: "e3e1e29c-fbd6-41f5-870f-b690738aaa0b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:10:28 crc kubenswrapper[4869]: I0218 06:10:28.624234 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3e1e29c-fbd6-41f5-870f-b690738aaa0b-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 06:10:28 crc kubenswrapper[4869]: I0218 06:10:28.792456 4869 generic.go:334] "Generic (PLEG): container finished" podID="e3e1e29c-fbd6-41f5-870f-b690738aaa0b" containerID="b9610420f90ca598af650b338beee899b76d76b8966ce99f0ea75c25b6c9c1ad" exitCode=0
Feb 18 06:10:28 crc kubenswrapper[4869]: I0218 06:10:28.792504 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbqv7" event={"ID":"e3e1e29c-fbd6-41f5-870f-b690738aaa0b","Type":"ContainerDied","Data":"b9610420f90ca598af650b338beee899b76d76b8966ce99f0ea75c25b6c9c1ad"}
Feb 18 06:10:28 crc kubenswrapper[4869]: I0218 06:10:28.792517 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bbqv7"
Feb 18 06:10:28 crc kubenswrapper[4869]: I0218 06:10:28.792532 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbqv7" event={"ID":"e3e1e29c-fbd6-41f5-870f-b690738aaa0b","Type":"ContainerDied","Data":"3137c07753dad0d89c7a63f3765c449fd09d4ed8a40d7998a6431078f35b1c1d"}
Feb 18 06:10:28 crc kubenswrapper[4869]: I0218 06:10:28.792554 4869 scope.go:117] "RemoveContainer" containerID="b9610420f90ca598af650b338beee899b76d76b8966ce99f0ea75c25b6c9c1ad"
Feb 18 06:10:28 crc kubenswrapper[4869]: I0218 06:10:28.827336 4869 scope.go:117] "RemoveContainer" containerID="47f3f4c796d8e7ec4ef4c41542d02274d9662c519010b236ce2db1ee3e262fcb"
Feb 18 06:10:28 crc kubenswrapper[4869]: I0218 06:10:28.842929 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bbqv7"]
Feb 18 06:10:28 crc kubenswrapper[4869]: I0218 06:10:28.862669 4869 scope.go:117] "RemoveContainer" containerID="1ec120c6c6fe45d88d8c787b416e549684ddbff42df34ffeec3a1dfe99cffe81"
Feb 18 06:10:28 crc kubenswrapper[4869]: I0218 06:10:28.864715 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bbqv7"]
Feb 18 06:10:28 crc kubenswrapper[4869]: I0218 06:10:28.884403 4869 scope.go:117] "RemoveContainer" containerID="b9610420f90ca598af650b338beee899b76d76b8966ce99f0ea75c25b6c9c1ad"
Feb 18 06:10:28 crc kubenswrapper[4869]: E0218 06:10:28.884865 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9610420f90ca598af650b338beee899b76d76b8966ce99f0ea75c25b6c9c1ad\": container with ID starting with b9610420f90ca598af650b338beee899b76d76b8966ce99f0ea75c25b6c9c1ad not found: ID does not exist" containerID="b9610420f90ca598af650b338beee899b76d76b8966ce99f0ea75c25b6c9c1ad"
Feb 18 06:10:28 crc kubenswrapper[4869]: I0218 06:10:28.884902 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9610420f90ca598af650b338beee899b76d76b8966ce99f0ea75c25b6c9c1ad"} err="failed to get container status \"b9610420f90ca598af650b338beee899b76d76b8966ce99f0ea75c25b6c9c1ad\": rpc error: code = NotFound desc = could not find container \"b9610420f90ca598af650b338beee899b76d76b8966ce99f0ea75c25b6c9c1ad\": container with ID starting with b9610420f90ca598af650b338beee899b76d76b8966ce99f0ea75c25b6c9c1ad not found: ID does not exist"
Feb 18 06:10:28 crc kubenswrapper[4869]: I0218 06:10:28.884923 4869 scope.go:117] "RemoveContainer" containerID="47f3f4c796d8e7ec4ef4c41542d02274d9662c519010b236ce2db1ee3e262fcb"
Feb 18 06:10:28 crc kubenswrapper[4869]: E0218 06:10:28.885273 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47f3f4c796d8e7ec4ef4c41542d02274d9662c519010b236ce2db1ee3e262fcb\": container with ID starting with 47f3f4c796d8e7ec4ef4c41542d02274d9662c519010b236ce2db1ee3e262fcb not found: ID does not exist" containerID="47f3f4c796d8e7ec4ef4c41542d02274d9662c519010b236ce2db1ee3e262fcb"
Feb 18 06:10:28 crc kubenswrapper[4869]: I0218 06:10:28.885294 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47f3f4c796d8e7ec4ef4c41542d02274d9662c519010b236ce2db1ee3e262fcb"} err="failed to get container status \"47f3f4c796d8e7ec4ef4c41542d02274d9662c519010b236ce2db1ee3e262fcb\": rpc error: code = NotFound desc = could not find container \"47f3f4c796d8e7ec4ef4c41542d02274d9662c519010b236ce2db1ee3e262fcb\": container with ID starting with 47f3f4c796d8e7ec4ef4c41542d02274d9662c519010b236ce2db1ee3e262fcb not found: ID does not exist"
Feb 18 06:10:28 crc kubenswrapper[4869]: I0218 06:10:28.885306 4869 scope.go:117] "RemoveContainer" containerID="1ec120c6c6fe45d88d8c787b416e549684ddbff42df34ffeec3a1dfe99cffe81"
Feb 18 06:10:28 crc kubenswrapper[4869]: E0218 06:10:28.885546 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ec120c6c6fe45d88d8c787b416e549684ddbff42df34ffeec3a1dfe99cffe81\": container with ID starting with 1ec120c6c6fe45d88d8c787b416e549684ddbff42df34ffeec3a1dfe99cffe81 not found: ID does not exist" containerID="1ec120c6c6fe45d88d8c787b416e549684ddbff42df34ffeec3a1dfe99cffe81"
Feb 18 06:10:28 crc kubenswrapper[4869]: I0218 06:10:28.885564 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ec120c6c6fe45d88d8c787b416e549684ddbff42df34ffeec3a1dfe99cffe81"} err="failed to get container status \"1ec120c6c6fe45d88d8c787b416e549684ddbff42df34ffeec3a1dfe99cffe81\": rpc error: code = NotFound desc = could not find container \"1ec120c6c6fe45d88d8c787b416e549684ddbff42df34ffeec3a1dfe99cffe81\": container with ID starting with 1ec120c6c6fe45d88d8c787b416e549684ddbff42df34ffeec3a1dfe99cffe81 not found: ID does not exist"
Feb 18 06:10:28 crc kubenswrapper[4869]: E0218 06:10:28.891183 4869 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00660609_49d3_4cf5_a722_35739a5493e8.slice/crio-a72b89ef27671507b89d18cfe316fcb3a5375a0db011860a6547f8459303f512\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00660609_49d3_4cf5_a722_35739a5493e8.slice\": RecentStats: unable to find data in memory cache]"
Feb 18 06:10:29 crc kubenswrapper[4869]: I0218 06:10:29.490586 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3e1e29c-fbd6-41f5-870f-b690738aaa0b" path="/var/lib/kubelet/pods/e3e1e29c-fbd6-41f5-870f-b690738aaa0b/volumes"
Feb 18 06:10:39 crc kubenswrapper[4869]: E0218 06:10:39.119565 4869 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00660609_49d3_4cf5_a722_35739a5493e8.slice/crio-a72b89ef27671507b89d18cfe316fcb3a5375a0db011860a6547f8459303f512\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00660609_49d3_4cf5_a722_35739a5493e8.slice\": RecentStats: unable to find data in memory cache]"
Feb 18 06:10:40 crc kubenswrapper[4869]: I0218 06:10:40.133126 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 06:10:40 crc kubenswrapper[4869]: I0218 06:10:40.133194 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 06:10:40 crc kubenswrapper[4869]: I0218 06:10:40.133246 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh"
Feb 18 06:10:40 crc kubenswrapper[4869]: I0218 06:10:40.134292 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"37492f897717923690caff194814f180694044bf504a2fcda1d5391e8ea76923"} pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 18 06:10:40 crc kubenswrapper[4869]: I0218 06:10:40.134390 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" containerID="cri-o://37492f897717923690caff194814f180694044bf504a2fcda1d5391e8ea76923" gracePeriod=600
Feb 18 06:10:40 crc kubenswrapper[4869]: I0218 06:10:40.915675 4869 generic.go:334] "Generic (PLEG): container finished" podID="781aec66-5fc7-4161-a704-cc78830d525d" containerID="37492f897717923690caff194814f180694044bf504a2fcda1d5391e8ea76923" exitCode=0
Feb 18 06:10:40 crc kubenswrapper[4869]: I0218 06:10:40.915839 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerDied","Data":"37492f897717923690caff194814f180694044bf504a2fcda1d5391e8ea76923"}
Feb 18 06:10:40 crc kubenswrapper[4869]: I0218 06:10:40.916317 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerStarted","Data":"6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7"}
Feb 18 06:10:40 crc kubenswrapper[4869]: I0218 06:10:40.916339 4869 scope.go:117] "RemoveContainer" containerID="e88c90367f7599ac382291baac95a475e9f7f579d4283380c069d22ac74cf0e6"
Feb 18 06:10:43 crc kubenswrapper[4869]: I0218 06:10:43.653554 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pbf6w"]
Feb 18 06:10:43 crc kubenswrapper[4869]: E0218 06:10:43.654602 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3e1e29c-fbd6-41f5-870f-b690738aaa0b" containerName="extract-utilities"
Feb 18 06:10:43 crc kubenswrapper[4869]: I0218 06:10:43.654619 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e1e29c-fbd6-41f5-870f-b690738aaa0b" containerName="extract-utilities"
Feb 18 06:10:43 crc kubenswrapper[4869]: E0218 06:10:43.654636 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00660609-49d3-4cf5-a722-35739a5493e8" containerName="extract-utilities"
Feb 18 06:10:43 crc kubenswrapper[4869]: I0218 06:10:43.654644 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="00660609-49d3-4cf5-a722-35739a5493e8" containerName="extract-utilities"
Feb 18 06:10:43 crc kubenswrapper[4869]: E0218 06:10:43.654670 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3e1e29c-fbd6-41f5-870f-b690738aaa0b" containerName="registry-server"
Feb 18 06:10:43 crc kubenswrapper[4869]: I0218 06:10:43.654679 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e1e29c-fbd6-41f5-870f-b690738aaa0b" containerName="registry-server"
Feb 18 06:10:43 crc kubenswrapper[4869]: E0218 06:10:43.654692 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00660609-49d3-4cf5-a722-35739a5493e8" containerName="registry-server"
Feb 18 06:10:43 crc kubenswrapper[4869]: I0218 06:10:43.654700 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="00660609-49d3-4cf5-a722-35739a5493e8" containerName="registry-server"
Feb 18 06:10:43 crc kubenswrapper[4869]: E0218 06:10:43.654723 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3e1e29c-fbd6-41f5-870f-b690738aaa0b" containerName="extract-content"
Feb 18 06:10:43 crc kubenswrapper[4869]: I0218 06:10:43.654732 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e1e29c-fbd6-41f5-870f-b690738aaa0b" containerName="extract-content"
Feb 18 06:10:43 crc kubenswrapper[4869]: E0218 06:10:43.654830 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00660609-49d3-4cf5-a722-35739a5493e8" containerName="extract-content"
Feb 18 06:10:43 crc kubenswrapper[4869]: I0218 06:10:43.654839 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="00660609-49d3-4cf5-a722-35739a5493e8" containerName="extract-content"
Feb 18 06:10:43 crc kubenswrapper[4869]: I0218 06:10:43.655092 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3e1e29c-fbd6-41f5-870f-b690738aaa0b" containerName="registry-server"
Feb 18 06:10:43 crc kubenswrapper[4869]: I0218 06:10:43.655104 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="00660609-49d3-4cf5-a722-35739a5493e8" containerName="registry-server"
Feb 18 06:10:43 crc kubenswrapper[4869]: I0218 06:10:43.656730 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pbf6w"
Feb 18 06:10:43 crc kubenswrapper[4869]: I0218 06:10:43.668104 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pbf6w"]
Feb 18 06:10:43 crc kubenswrapper[4869]: I0218 06:10:43.733094 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4f66087-7138-4398-b95f-f9115bb32120-catalog-content\") pod \"community-operators-pbf6w\" (UID: \"b4f66087-7138-4398-b95f-f9115bb32120\") " pod="openshift-marketplace/community-operators-pbf6w"
Feb 18 06:10:43 crc kubenswrapper[4869]: I0218 06:10:43.733153 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx7tg\" (UniqueName: \"kubernetes.io/projected/b4f66087-7138-4398-b95f-f9115bb32120-kube-api-access-bx7tg\") pod \"community-operators-pbf6w\" (UID: \"b4f66087-7138-4398-b95f-f9115bb32120\") " pod="openshift-marketplace/community-operators-pbf6w"
Feb 18 06:10:43 crc kubenswrapper[4869]: I0218 06:10:43.733230 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4f66087-7138-4398-b95f-f9115bb32120-utilities\") pod \"community-operators-pbf6w\" (UID: \"b4f66087-7138-4398-b95f-f9115bb32120\") " pod="openshift-marketplace/community-operators-pbf6w"
Feb 18 06:10:43 crc kubenswrapper[4869]: I0218 06:10:43.835652 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4f66087-7138-4398-b95f-f9115bb32120-catalog-content\") pod \"community-operators-pbf6w\" (UID: \"b4f66087-7138-4398-b95f-f9115bb32120\") " pod="openshift-marketplace/community-operators-pbf6w"
Feb 18 06:10:43 crc kubenswrapper[4869]: I0218 06:10:43.835961 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx7tg\" (UniqueName: \"kubernetes.io/projected/b4f66087-7138-4398-b95f-f9115bb32120-kube-api-access-bx7tg\") pod \"community-operators-pbf6w\" (UID: \"b4f66087-7138-4398-b95f-f9115bb32120\") " pod="openshift-marketplace/community-operators-pbf6w"
Feb 18 06:10:43 crc kubenswrapper[4869]: I0218 06:10:43.836110 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4f66087-7138-4398-b95f-f9115bb32120-utilities\") pod \"community-operators-pbf6w\" (UID: \"b4f66087-7138-4398-b95f-f9115bb32120\") " pod="openshift-marketplace/community-operators-pbf6w"
Feb 18 06:10:43 crc kubenswrapper[4869]: I0218 06:10:43.836271 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4f66087-7138-4398-b95f-f9115bb32120-catalog-content\") pod \"community-operators-pbf6w\" (UID: \"b4f66087-7138-4398-b95f-f9115bb32120\") " pod="openshift-marketplace/community-operators-pbf6w"
Feb 18 06:10:43 crc kubenswrapper[4869]: I0218 06:10:43.836525 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4f66087-7138-4398-b95f-f9115bb32120-utilities\") pod \"community-operators-pbf6w\" (UID: \"b4f66087-7138-4398-b95f-f9115bb32120\") " pod="openshift-marketplace/community-operators-pbf6w"
Feb 18 06:10:43 crc kubenswrapper[4869]: I0218 06:10:43.856366 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx7tg\" (UniqueName: \"kubernetes.io/projected/b4f66087-7138-4398-b95f-f9115bb32120-kube-api-access-bx7tg\") pod \"community-operators-pbf6w\" (UID: \"b4f66087-7138-4398-b95f-f9115bb32120\") " pod="openshift-marketplace/community-operators-pbf6w"
Feb 18 06:10:44 crc kubenswrapper[4869]: I0218 06:10:44.009423 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pbf6w"
Feb 18 06:10:44 crc kubenswrapper[4869]: I0218 06:10:44.532905 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pbf6w"]
Feb 18 06:10:44 crc kubenswrapper[4869]: I0218 06:10:44.953594 4869 generic.go:334] "Generic (PLEG): container finished" podID="b4f66087-7138-4398-b95f-f9115bb32120" containerID="5e3fa20092b5f09b74f9cb269be6b91ec1774c470ad7b5145303401dcb75391b" exitCode=0
Feb 18 06:10:44 crc kubenswrapper[4869]: I0218 06:10:44.953679 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pbf6w" event={"ID":"b4f66087-7138-4398-b95f-f9115bb32120","Type":"ContainerDied","Data":"5e3fa20092b5f09b74f9cb269be6b91ec1774c470ad7b5145303401dcb75391b"}
Feb 18 06:10:44 crc kubenswrapper[4869]: I0218 06:10:44.953947 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pbf6w" event={"ID":"b4f66087-7138-4398-b95f-f9115bb32120","Type":"ContainerStarted","Data":"b92464602c1c8494169da01e63c778ce23b566fc8bc1e4b7beeecfabd048d2ee"}
Feb 18 06:10:45 crc kubenswrapper[4869]: I0218 06:10:45.964571 4869 generic.go:334] "Generic (PLEG): container finished" podID="b4f66087-7138-4398-b95f-f9115bb32120" containerID="cce00f544dd214ed8e16dd332756bcf736d1a10e0d9cd358daea2360f745d678" exitCode=0
Feb 18 06:10:45 crc kubenswrapper[4869]: I0218 06:10:45.964647 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pbf6w" event={"ID":"b4f66087-7138-4398-b95f-f9115bb32120","Type":"ContainerDied","Data":"cce00f544dd214ed8e16dd332756bcf736d1a10e0d9cd358daea2360f745d678"}
Feb 18 06:10:46 crc kubenswrapper[4869]: I0218 06:10:46.976969 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pbf6w" event={"ID":"b4f66087-7138-4398-b95f-f9115bb32120","Type":"ContainerStarted","Data":"9015545f813395c11128acf8a002e484b1ba2efd869875ae79437c203649650d"}
Feb 18 06:10:47 crc kubenswrapper[4869]: I0218 06:10:47.001357 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pbf6w" podStartSLOduration=2.55047005 podStartE2EDuration="4.001338388s" podCreationTimestamp="2026-02-18 06:10:43 +0000 UTC" firstStartedPulling="2026-02-18 06:10:44.955472275 +0000 UTC m=+1342.124560507" lastFinishedPulling="2026-02-18 06:10:46.406340613 +0000 UTC m=+1343.575428845" observedRunningTime="2026-02-18 06:10:46.993228825 +0000 UTC m=+1344.162317047" watchObservedRunningTime="2026-02-18 06:10:47.001338388 +0000 UTC m=+1344.170426620"
Feb 18 06:10:49 crc kubenswrapper[4869]: E0218 06:10:49.435610 4869 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00660609_49d3_4cf5_a722_35739a5493e8.slice/crio-a72b89ef27671507b89d18cfe316fcb3a5375a0db011860a6547f8459303f512\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00660609_49d3_4cf5_a722_35739a5493e8.slice\": RecentStats: unable to find data in memory cache]"
Feb 18 06:10:54 crc kubenswrapper[4869]: I0218 06:10:54.010468 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pbf6w"
Feb 18 06:10:54 crc kubenswrapper[4869]: I0218 06:10:54.011235 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pbf6w"
Feb 18 06:10:54 crc kubenswrapper[4869]: I0218 06:10:54.063139 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pbf6w"
Feb 18 06:10:54 crc kubenswrapper[4869]: I0218 06:10:54.113507 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pbf6w"
Feb 18 06:10:54 crc kubenswrapper[4869]: I0218 06:10:54.306523 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pbf6w"]
Feb 18 06:10:56 crc kubenswrapper[4869]: I0218 06:10:56.061300 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pbf6w" podUID="b4f66087-7138-4398-b95f-f9115bb32120" containerName="registry-server" containerID="cri-o://9015545f813395c11128acf8a002e484b1ba2efd869875ae79437c203649650d" gracePeriod=2
Feb 18 06:10:57 crc kubenswrapper[4869]: I0218 06:10:57.012039 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pbf6w"
Feb 18 06:10:57 crc kubenswrapper[4869]: I0218 06:10:57.073402 4869 generic.go:334] "Generic (PLEG): container finished" podID="b4f66087-7138-4398-b95f-f9115bb32120" containerID="9015545f813395c11128acf8a002e484b1ba2efd869875ae79437c203649650d" exitCode=0
Feb 18 06:10:57 crc kubenswrapper[4869]: I0218 06:10:57.073446 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pbf6w" event={"ID":"b4f66087-7138-4398-b95f-f9115bb32120","Type":"ContainerDied","Data":"9015545f813395c11128acf8a002e484b1ba2efd869875ae79437c203649650d"}
Feb 18 06:10:57 crc kubenswrapper[4869]: I0218 06:10:57.073471 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pbf6w" event={"ID":"b4f66087-7138-4398-b95f-f9115bb32120","Type":"ContainerDied","Data":"b92464602c1c8494169da01e63c778ce23b566fc8bc1e4b7beeecfabd048d2ee"}
Feb 18 06:10:57 crc kubenswrapper[4869]: I0218 06:10:57.073489 4869 scope.go:117] "RemoveContainer" containerID="9015545f813395c11128acf8a002e484b1ba2efd869875ae79437c203649650d"
Feb 18 06:10:57 crc kubenswrapper[4869]: I0218 06:10:57.073544 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pbf6w"
Feb 18 06:10:57 crc kubenswrapper[4869]: I0218 06:10:57.098183 4869 scope.go:117] "RemoveContainer" containerID="cce00f544dd214ed8e16dd332756bcf736d1a10e0d9cd358daea2360f745d678"
Feb 18 06:10:57 crc kubenswrapper[4869]: I0218 06:10:57.113211 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4f66087-7138-4398-b95f-f9115bb32120-catalog-content\") pod \"b4f66087-7138-4398-b95f-f9115bb32120\" (UID: \"b4f66087-7138-4398-b95f-f9115bb32120\") "
Feb 18 06:10:57 crc kubenswrapper[4869]: I0218 06:10:57.113364 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx7tg\" (UniqueName: \"kubernetes.io/projected/b4f66087-7138-4398-b95f-f9115bb32120-kube-api-access-bx7tg\") pod \"b4f66087-7138-4398-b95f-f9115bb32120\" (UID: \"b4f66087-7138-4398-b95f-f9115bb32120\") "
Feb 18 06:10:57 crc kubenswrapper[4869]: I0218 06:10:57.113489 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4f66087-7138-4398-b95f-f9115bb32120-utilities\") pod \"b4f66087-7138-4398-b95f-f9115bb32120\" (UID: \"b4f66087-7138-4398-b95f-f9115bb32120\") "
Feb 18 06:10:57 crc kubenswrapper[4869]: I0218 06:10:57.114739 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4f66087-7138-4398-b95f-f9115bb32120-utilities" (OuterVolumeSpecName: "utilities") pod "b4f66087-7138-4398-b95f-f9115bb32120" (UID: "b4f66087-7138-4398-b95f-f9115bb32120"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:10:57 crc kubenswrapper[4869]: I0218 06:10:57.121710 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4f66087-7138-4398-b95f-f9115bb32120-kube-api-access-bx7tg" (OuterVolumeSpecName: "kube-api-access-bx7tg") pod "b4f66087-7138-4398-b95f-f9115bb32120" (UID: "b4f66087-7138-4398-b95f-f9115bb32120"). InnerVolumeSpecName "kube-api-access-bx7tg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:10:57 crc kubenswrapper[4869]: I0218 06:10:57.127416 4869 scope.go:117] "RemoveContainer" containerID="5e3fa20092b5f09b74f9cb269be6b91ec1774c470ad7b5145303401dcb75391b"
Feb 18 06:10:57 crc kubenswrapper[4869]: I0218 06:10:57.179586 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4f66087-7138-4398-b95f-f9115bb32120-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4f66087-7138-4398-b95f-f9115bb32120" (UID: "b4f66087-7138-4398-b95f-f9115bb32120"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:10:57 crc kubenswrapper[4869]: I0218 06:10:57.212233 4869 scope.go:117] "RemoveContainer" containerID="9015545f813395c11128acf8a002e484b1ba2efd869875ae79437c203649650d"
Feb 18 06:10:57 crc kubenswrapper[4869]: E0218 06:10:57.212715 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9015545f813395c11128acf8a002e484b1ba2efd869875ae79437c203649650d\": container with ID starting with 9015545f813395c11128acf8a002e484b1ba2efd869875ae79437c203649650d not found: ID does not exist" containerID="9015545f813395c11128acf8a002e484b1ba2efd869875ae79437c203649650d"
Feb 18 06:10:57 crc kubenswrapper[4869]: I0218 06:10:57.212787 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9015545f813395c11128acf8a002e484b1ba2efd869875ae79437c203649650d"} err="failed to get container status \"9015545f813395c11128acf8a002e484b1ba2efd869875ae79437c203649650d\": rpc error: code = NotFound desc = could not find container \"9015545f813395c11128acf8a002e484b1ba2efd869875ae79437c203649650d\": container with ID starting with 9015545f813395c11128acf8a002e484b1ba2efd869875ae79437c203649650d not found: ID does not exist"
Feb 18 06:10:57 crc kubenswrapper[4869]: I0218 06:10:57.212815 4869 scope.go:117] "RemoveContainer" containerID="cce00f544dd214ed8e16dd332756bcf736d1a10e0d9cd358daea2360f745d678"
Feb 18 06:10:57 crc kubenswrapper[4869]: E0218 06:10:57.213340 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cce00f544dd214ed8e16dd332756bcf736d1a10e0d9cd358daea2360f745d678\": container with ID starting with cce00f544dd214ed8e16dd332756bcf736d1a10e0d9cd358daea2360f745d678 not found: ID does not exist" containerID="cce00f544dd214ed8e16dd332756bcf736d1a10e0d9cd358daea2360f745d678"
Feb 18 06:10:57 crc kubenswrapper[4869]: I0218 06:10:57.213370 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cce00f544dd214ed8e16dd332756bcf736d1a10e0d9cd358daea2360f745d678"} err="failed to get container status \"cce00f544dd214ed8e16dd332756bcf736d1a10e0d9cd358daea2360f745d678\": rpc error: code = NotFound desc = could not find container \"cce00f544dd214ed8e16dd332756bcf736d1a10e0d9cd358daea2360f745d678\": container with ID starting with cce00f544dd214ed8e16dd332756bcf736d1a10e0d9cd358daea2360f745d678 not found: ID does not exist"
Feb 18 06:10:57 crc kubenswrapper[4869]: I0218 06:10:57.213387 4869 scope.go:117] "RemoveContainer" containerID="5e3fa20092b5f09b74f9cb269be6b91ec1774c470ad7b5145303401dcb75391b"
Feb 18 06:10:57 crc kubenswrapper[4869]: E0218 06:10:57.213771 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e3fa20092b5f09b74f9cb269be6b91ec1774c470ad7b5145303401dcb75391b\": container with ID starting with 5e3fa20092b5f09b74f9cb269be6b91ec1774c470ad7b5145303401dcb75391b not found: ID does not exist" containerID="5e3fa20092b5f09b74f9cb269be6b91ec1774c470ad7b5145303401dcb75391b"
Feb 18 06:10:57 crc kubenswrapper[4869]: I0218 06:10:57.213841 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e3fa20092b5f09b74f9cb269be6b91ec1774c470ad7b5145303401dcb75391b"} err="failed to get container status \"5e3fa20092b5f09b74f9cb269be6b91ec1774c470ad7b5145303401dcb75391b\": rpc error: code = NotFound desc = could not find container \"5e3fa20092b5f09b74f9cb269be6b91ec1774c470ad7b5145303401dcb75391b\": container with ID starting with 5e3fa20092b5f09b74f9cb269be6b91ec1774c470ad7b5145303401dcb75391b not found: ID does not exist"
Feb 18 06:10:57 crc kubenswrapper[4869]: I0218 06:10:57.217213 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx7tg\" (UniqueName: \"kubernetes.io/projected/b4f66087-7138-4398-b95f-f9115bb32120-kube-api-access-bx7tg\") on node \"crc\" DevicePath \"\""
Feb 18 06:10:57 crc kubenswrapper[4869]: I0218 06:10:57.217273 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4f66087-7138-4398-b95f-f9115bb32120-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 06:10:57 crc kubenswrapper[4869]: I0218 06:10:57.217296 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4f66087-7138-4398-b95f-f9115bb32120-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 06:10:57 crc kubenswrapper[4869]: I0218 06:10:57.413726 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pbf6w"]
Feb 18 06:10:57 crc kubenswrapper[4869]: I0218 06:10:57.421952 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pbf6w"]
Feb 18 06:10:57 crc kubenswrapper[4869]: I0218 06:10:57.481021 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4f66087-7138-4398-b95f-f9115bb32120" path="/var/lib/kubelet/pods/b4f66087-7138-4398-b95f-f9115bb32120/volumes"
Feb 18 06:10:59 crc kubenswrapper[4869]: E0218 06:10:59.673791 4869 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00660609_49d3_4cf5_a722_35739a5493e8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00660609_49d3_4cf5_a722_35739a5493e8.slice/crio-a72b89ef27671507b89d18cfe316fcb3a5375a0db011860a6547f8459303f512\": RecentStats: unable to find data in memory cache]"
Feb 18 06:11:09 crc kubenswrapper[4869]: E0218 06:11:09.953367 4869 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00660609_49d3_4cf5_a722_35739a5493e8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00660609_49d3_4cf5_a722_35739a5493e8.slice/crio-a72b89ef27671507b89d18cfe316fcb3a5375a0db011860a6547f8459303f512\": RecentStats: unable to find data in memory cache]"
Feb 18 06:11:20 crc kubenswrapper[4869]: E0218 06:11:20.254924 4869 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00660609_49d3_4cf5_a722_35739a5493e8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00660609_49d3_4cf5_a722_35739a5493e8.slice/crio-a72b89ef27671507b89d18cfe316fcb3a5375a0db011860a6547f8459303f512\": RecentStats: unable to find data in memory cache]"
Feb 18 06:11:28 crc kubenswrapper[4869]: I0218 06:11:28.124681 4869 scope.go:117] "RemoveContainer" containerID="58e39ac6f1f159b42d13f6561c2908dd16b19cd0850d79e5527a819c6755a9b7"
Feb 18 06:11:28 crc kubenswrapper[4869]: I0218 06:11:28.158082 4869 scope.go:117] "RemoveContainer" containerID="cc0a7c6e939da768bd5956a41cdac243c14d475ab28870517e968bd5b869b21f"
Feb 18 06:11:28 crc kubenswrapper[4869]: I0218 06:11:28.228950 4869 scope.go:117] "RemoveContainer" containerID="12560d9c5c3f2fd14f7c7447a545b0f0208f89787c6af42371191217bd5460cf"
Feb 18 06:11:28 crc kubenswrapper[4869]: I0218 06:11:28.260124 4869 scope.go:117] "RemoveContainer" containerID="a8b1294c2ad83064d6faed9445497098a3271d80ca3d87ddb2c0a98238751b1c"
Feb 18 06:12:28 crc kubenswrapper[4869]: I0218 06:12:28.404593 4869 scope.go:117] "RemoveContainer" containerID="24bd6d819bf8cd5bfda9ebdaec36688778e6e8216b265ef48658d29cfda6cb60"
Feb 18 06:12:28 crc kubenswrapper[4869]: I0218 06:12:28.436434 4869 scope.go:117] "RemoveContainer" containerID="2b674ef05b97689ad31861ae1b0ac85b1b20724d20ada4f458e120c9993aef35"
Feb 18 06:12:28 crc kubenswrapper[4869]: I0218 06:12:28.459715 4869 scope.go:117] "RemoveContainer" containerID="0610aa1c580eda1201bb0355e4a294c7c1ed25aed49c83ad5fd9185ce11a18b7"
Feb 18 06:12:28 crc kubenswrapper[4869]: I0218 06:12:28.481606 4869 scope.go:117] "RemoveContainer" containerID="66d076ecdfaed3c8113dfb58c15b90a792d6bfbbb8f2080fd7a314b410017b9e"
Feb 18 06:12:40 crc kubenswrapper[4869]: I0218 06:12:40.078683 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-whfcn"]
Feb 18 06:12:40 crc kubenswrapper[4869]: E0218 06:12:40.079719 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4f66087-7138-4398-b95f-f9115bb32120" containerName="extract-content"
Feb 18 06:12:40 crc kubenswrapper[4869]: I0218 06:12:40.079736 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f66087-7138-4398-b95f-f9115bb32120" containerName="extract-content"
Feb 18 06:12:40 crc kubenswrapper[4869]: E0218 06:12:40.079770 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4f66087-7138-4398-b95f-f9115bb32120" containerName="registry-server"
Feb 18 06:12:40 crc kubenswrapper[4869]: I0218 06:12:40.079780 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f66087-7138-4398-b95f-f9115bb32120" containerName="registry-server"
Feb 18 06:12:40 crc kubenswrapper[4869]: E0218 06:12:40.079803 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4f66087-7138-4398-b95f-f9115bb32120" containerName="extract-utilities"
Feb 18 06:12:40 crc kubenswrapper[4869]: I0218 06:12:40.079811 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f66087-7138-4398-b95f-f9115bb32120" containerName="extract-utilities"
Feb 18 06:12:40 crc kubenswrapper[4869]: I0218 06:12:40.080052 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4f66087-7138-4398-b95f-f9115bb32120"
containerName="registry-server" Feb 18 06:12:40 crc kubenswrapper[4869]: I0218 06:12:40.081771 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-whfcn" Feb 18 06:12:40 crc kubenswrapper[4869]: I0218 06:12:40.089035 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-whfcn"] Feb 18 06:12:40 crc kubenswrapper[4869]: I0218 06:12:40.132665 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:12:40 crc kubenswrapper[4869]: I0218 06:12:40.132719 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:12:40 crc kubenswrapper[4869]: I0218 06:12:40.236196 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc415ece-1b11-419f-a3fc-b973f44010ed-catalog-content\") pod \"certified-operators-whfcn\" (UID: \"dc415ece-1b11-419f-a3fc-b973f44010ed\") " pod="openshift-marketplace/certified-operators-whfcn" Feb 18 06:12:40 crc kubenswrapper[4869]: I0218 06:12:40.236271 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc415ece-1b11-419f-a3fc-b973f44010ed-utilities\") pod \"certified-operators-whfcn\" (UID: \"dc415ece-1b11-419f-a3fc-b973f44010ed\") " pod="openshift-marketplace/certified-operators-whfcn" Feb 18 06:12:40 crc 
kubenswrapper[4869]: I0218 06:12:40.236301 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktjwz\" (UniqueName: \"kubernetes.io/projected/dc415ece-1b11-419f-a3fc-b973f44010ed-kube-api-access-ktjwz\") pod \"certified-operators-whfcn\" (UID: \"dc415ece-1b11-419f-a3fc-b973f44010ed\") " pod="openshift-marketplace/certified-operators-whfcn" Feb 18 06:12:40 crc kubenswrapper[4869]: I0218 06:12:40.338090 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc415ece-1b11-419f-a3fc-b973f44010ed-catalog-content\") pod \"certified-operators-whfcn\" (UID: \"dc415ece-1b11-419f-a3fc-b973f44010ed\") " pod="openshift-marketplace/certified-operators-whfcn" Feb 18 06:12:40 crc kubenswrapper[4869]: I0218 06:12:40.338393 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc415ece-1b11-419f-a3fc-b973f44010ed-utilities\") pod \"certified-operators-whfcn\" (UID: \"dc415ece-1b11-419f-a3fc-b973f44010ed\") " pod="openshift-marketplace/certified-operators-whfcn" Feb 18 06:12:40 crc kubenswrapper[4869]: I0218 06:12:40.338503 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktjwz\" (UniqueName: \"kubernetes.io/projected/dc415ece-1b11-419f-a3fc-b973f44010ed-kube-api-access-ktjwz\") pod \"certified-operators-whfcn\" (UID: \"dc415ece-1b11-419f-a3fc-b973f44010ed\") " pod="openshift-marketplace/certified-operators-whfcn" Feb 18 06:12:40 crc kubenswrapper[4869]: I0218 06:12:40.338613 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc415ece-1b11-419f-a3fc-b973f44010ed-catalog-content\") pod \"certified-operators-whfcn\" (UID: \"dc415ece-1b11-419f-a3fc-b973f44010ed\") " pod="openshift-marketplace/certified-operators-whfcn" Feb 18 
06:12:40 crc kubenswrapper[4869]: I0218 06:12:40.338735 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc415ece-1b11-419f-a3fc-b973f44010ed-utilities\") pod \"certified-operators-whfcn\" (UID: \"dc415ece-1b11-419f-a3fc-b973f44010ed\") " pod="openshift-marketplace/certified-operators-whfcn" Feb 18 06:12:40 crc kubenswrapper[4869]: I0218 06:12:40.363902 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktjwz\" (UniqueName: \"kubernetes.io/projected/dc415ece-1b11-419f-a3fc-b973f44010ed-kube-api-access-ktjwz\") pod \"certified-operators-whfcn\" (UID: \"dc415ece-1b11-419f-a3fc-b973f44010ed\") " pod="openshift-marketplace/certified-operators-whfcn" Feb 18 06:12:40 crc kubenswrapper[4869]: I0218 06:12:40.409488 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-whfcn" Feb 18 06:12:40 crc kubenswrapper[4869]: I0218 06:12:40.918149 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-whfcn"] Feb 18 06:12:41 crc kubenswrapper[4869]: I0218 06:12:41.202412 4869 generic.go:334] "Generic (PLEG): container finished" podID="dc415ece-1b11-419f-a3fc-b973f44010ed" containerID="6c3868ca9271e8d21e2f4cb27b41693251c37c9c44ea6ccbbe5a344e93d2461d" exitCode=0 Feb 18 06:12:41 crc kubenswrapper[4869]: I0218 06:12:41.202473 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whfcn" event={"ID":"dc415ece-1b11-419f-a3fc-b973f44010ed","Type":"ContainerDied","Data":"6c3868ca9271e8d21e2f4cb27b41693251c37c9c44ea6ccbbe5a344e93d2461d"} Feb 18 06:12:41 crc kubenswrapper[4869]: I0218 06:12:41.202789 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whfcn" 
event={"ID":"dc415ece-1b11-419f-a3fc-b973f44010ed","Type":"ContainerStarted","Data":"2f03559f43b8d8093996de42214a18ea0433f43c14821a7157f8a9af6f581e2d"} Feb 18 06:12:42 crc kubenswrapper[4869]: I0218 06:12:42.215409 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whfcn" event={"ID":"dc415ece-1b11-419f-a3fc-b973f44010ed","Type":"ContainerStarted","Data":"bdbf2ac9a3479f64d76b338a836fb3e056df1f7180620c362f6db43c7fbef04f"} Feb 18 06:12:43 crc kubenswrapper[4869]: I0218 06:12:43.225759 4869 generic.go:334] "Generic (PLEG): container finished" podID="dc415ece-1b11-419f-a3fc-b973f44010ed" containerID="bdbf2ac9a3479f64d76b338a836fb3e056df1f7180620c362f6db43c7fbef04f" exitCode=0 Feb 18 06:12:43 crc kubenswrapper[4869]: I0218 06:12:43.225831 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whfcn" event={"ID":"dc415ece-1b11-419f-a3fc-b973f44010ed","Type":"ContainerDied","Data":"bdbf2ac9a3479f64d76b338a836fb3e056df1f7180620c362f6db43c7fbef04f"} Feb 18 06:12:44 crc kubenswrapper[4869]: I0218 06:12:44.235709 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whfcn" event={"ID":"dc415ece-1b11-419f-a3fc-b973f44010ed","Type":"ContainerStarted","Data":"feca46039dbb16355c58f4ace1b3813ec58c50bdac3c25f659152d3c5fae4ba0"} Feb 18 06:12:44 crc kubenswrapper[4869]: I0218 06:12:44.260704 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-whfcn" podStartSLOduration=1.816577669 podStartE2EDuration="4.260685383s" podCreationTimestamp="2026-02-18 06:12:40 +0000 UTC" firstStartedPulling="2026-02-18 06:12:41.204464468 +0000 UTC m=+1458.373552720" lastFinishedPulling="2026-02-18 06:12:43.648572202 +0000 UTC m=+1460.817660434" observedRunningTime="2026-02-18 06:12:44.254055706 +0000 UTC m=+1461.423143938" watchObservedRunningTime="2026-02-18 06:12:44.260685383 +0000 UTC 
m=+1461.429773615" Feb 18 06:12:50 crc kubenswrapper[4869]: I0218 06:12:50.410506 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-whfcn" Feb 18 06:12:50 crc kubenswrapper[4869]: I0218 06:12:50.411162 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-whfcn" Feb 18 06:12:50 crc kubenswrapper[4869]: I0218 06:12:50.455732 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-whfcn" Feb 18 06:12:51 crc kubenswrapper[4869]: I0218 06:12:51.361876 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-whfcn" Feb 18 06:12:51 crc kubenswrapper[4869]: I0218 06:12:51.430039 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-whfcn"] Feb 18 06:12:53 crc kubenswrapper[4869]: I0218 06:12:53.346551 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-whfcn" podUID="dc415ece-1b11-419f-a3fc-b973f44010ed" containerName="registry-server" containerID="cri-o://feca46039dbb16355c58f4ace1b3813ec58c50bdac3c25f659152d3c5fae4ba0" gracePeriod=2 Feb 18 06:12:53 crc kubenswrapper[4869]: I0218 06:12:53.905098 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-whfcn" Feb 18 06:12:54 crc kubenswrapper[4869]: I0218 06:12:54.020297 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc415ece-1b11-419f-a3fc-b973f44010ed-catalog-content\") pod \"dc415ece-1b11-419f-a3fc-b973f44010ed\" (UID: \"dc415ece-1b11-419f-a3fc-b973f44010ed\") " Feb 18 06:12:54 crc kubenswrapper[4869]: I0218 06:12:54.020576 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc415ece-1b11-419f-a3fc-b973f44010ed-utilities\") pod \"dc415ece-1b11-419f-a3fc-b973f44010ed\" (UID: \"dc415ece-1b11-419f-a3fc-b973f44010ed\") " Feb 18 06:12:54 crc kubenswrapper[4869]: I0218 06:12:54.020972 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktjwz\" (UniqueName: \"kubernetes.io/projected/dc415ece-1b11-419f-a3fc-b973f44010ed-kube-api-access-ktjwz\") pod \"dc415ece-1b11-419f-a3fc-b973f44010ed\" (UID: \"dc415ece-1b11-419f-a3fc-b973f44010ed\") " Feb 18 06:12:54 crc kubenswrapper[4869]: I0218 06:12:54.021667 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc415ece-1b11-419f-a3fc-b973f44010ed-utilities" (OuterVolumeSpecName: "utilities") pod "dc415ece-1b11-419f-a3fc-b973f44010ed" (UID: "dc415ece-1b11-419f-a3fc-b973f44010ed"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:12:54 crc kubenswrapper[4869]: I0218 06:12:54.022565 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc415ece-1b11-419f-a3fc-b973f44010ed-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:12:54 crc kubenswrapper[4869]: I0218 06:12:54.041050 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc415ece-1b11-419f-a3fc-b973f44010ed-kube-api-access-ktjwz" (OuterVolumeSpecName: "kube-api-access-ktjwz") pod "dc415ece-1b11-419f-a3fc-b973f44010ed" (UID: "dc415ece-1b11-419f-a3fc-b973f44010ed"). InnerVolumeSpecName "kube-api-access-ktjwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:12:54 crc kubenswrapper[4869]: I0218 06:12:54.082621 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc415ece-1b11-419f-a3fc-b973f44010ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dc415ece-1b11-419f-a3fc-b973f44010ed" (UID: "dc415ece-1b11-419f-a3fc-b973f44010ed"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:12:54 crc kubenswrapper[4869]: I0218 06:12:54.124082 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktjwz\" (UniqueName: \"kubernetes.io/projected/dc415ece-1b11-419f-a3fc-b973f44010ed-kube-api-access-ktjwz\") on node \"crc\" DevicePath \"\"" Feb 18 06:12:54 crc kubenswrapper[4869]: I0218 06:12:54.124117 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc415ece-1b11-419f-a3fc-b973f44010ed-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:12:54 crc kubenswrapper[4869]: I0218 06:12:54.356604 4869 generic.go:334] "Generic (PLEG): container finished" podID="dc415ece-1b11-419f-a3fc-b973f44010ed" containerID="feca46039dbb16355c58f4ace1b3813ec58c50bdac3c25f659152d3c5fae4ba0" exitCode=0 Feb 18 06:12:54 crc kubenswrapper[4869]: I0218 06:12:54.356651 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whfcn" event={"ID":"dc415ece-1b11-419f-a3fc-b973f44010ed","Type":"ContainerDied","Data":"feca46039dbb16355c58f4ace1b3813ec58c50bdac3c25f659152d3c5fae4ba0"} Feb 18 06:12:54 crc kubenswrapper[4869]: I0218 06:12:54.356683 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whfcn" event={"ID":"dc415ece-1b11-419f-a3fc-b973f44010ed","Type":"ContainerDied","Data":"2f03559f43b8d8093996de42214a18ea0433f43c14821a7157f8a9af6f581e2d"} Feb 18 06:12:54 crc kubenswrapper[4869]: I0218 06:12:54.356705 4869 scope.go:117] "RemoveContainer" containerID="feca46039dbb16355c58f4ace1b3813ec58c50bdac3c25f659152d3c5fae4ba0" Feb 18 06:12:54 crc kubenswrapper[4869]: I0218 06:12:54.357978 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-whfcn" Feb 18 06:12:54 crc kubenswrapper[4869]: I0218 06:12:54.374399 4869 scope.go:117] "RemoveContainer" containerID="bdbf2ac9a3479f64d76b338a836fb3e056df1f7180620c362f6db43c7fbef04f" Feb 18 06:12:54 crc kubenswrapper[4869]: I0218 06:12:54.402431 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-whfcn"] Feb 18 06:12:54 crc kubenswrapper[4869]: I0218 06:12:54.409828 4869 scope.go:117] "RemoveContainer" containerID="6c3868ca9271e8d21e2f4cb27b41693251c37c9c44ea6ccbbe5a344e93d2461d" Feb 18 06:12:54 crc kubenswrapper[4869]: I0218 06:12:54.417649 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-whfcn"] Feb 18 06:12:54 crc kubenswrapper[4869]: I0218 06:12:54.449311 4869 scope.go:117] "RemoveContainer" containerID="feca46039dbb16355c58f4ace1b3813ec58c50bdac3c25f659152d3c5fae4ba0" Feb 18 06:12:54 crc kubenswrapper[4869]: E0218 06:12:54.449686 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"feca46039dbb16355c58f4ace1b3813ec58c50bdac3c25f659152d3c5fae4ba0\": container with ID starting with feca46039dbb16355c58f4ace1b3813ec58c50bdac3c25f659152d3c5fae4ba0 not found: ID does not exist" containerID="feca46039dbb16355c58f4ace1b3813ec58c50bdac3c25f659152d3c5fae4ba0" Feb 18 06:12:54 crc kubenswrapper[4869]: I0218 06:12:54.449754 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feca46039dbb16355c58f4ace1b3813ec58c50bdac3c25f659152d3c5fae4ba0"} err="failed to get container status \"feca46039dbb16355c58f4ace1b3813ec58c50bdac3c25f659152d3c5fae4ba0\": rpc error: code = NotFound desc = could not find container \"feca46039dbb16355c58f4ace1b3813ec58c50bdac3c25f659152d3c5fae4ba0\": container with ID starting with feca46039dbb16355c58f4ace1b3813ec58c50bdac3c25f659152d3c5fae4ba0 not 
found: ID does not exist" Feb 18 06:12:54 crc kubenswrapper[4869]: I0218 06:12:54.449785 4869 scope.go:117] "RemoveContainer" containerID="bdbf2ac9a3479f64d76b338a836fb3e056df1f7180620c362f6db43c7fbef04f" Feb 18 06:12:54 crc kubenswrapper[4869]: E0218 06:12:54.450014 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdbf2ac9a3479f64d76b338a836fb3e056df1f7180620c362f6db43c7fbef04f\": container with ID starting with bdbf2ac9a3479f64d76b338a836fb3e056df1f7180620c362f6db43c7fbef04f not found: ID does not exist" containerID="bdbf2ac9a3479f64d76b338a836fb3e056df1f7180620c362f6db43c7fbef04f" Feb 18 06:12:54 crc kubenswrapper[4869]: I0218 06:12:54.450037 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdbf2ac9a3479f64d76b338a836fb3e056df1f7180620c362f6db43c7fbef04f"} err="failed to get container status \"bdbf2ac9a3479f64d76b338a836fb3e056df1f7180620c362f6db43c7fbef04f\": rpc error: code = NotFound desc = could not find container \"bdbf2ac9a3479f64d76b338a836fb3e056df1f7180620c362f6db43c7fbef04f\": container with ID starting with bdbf2ac9a3479f64d76b338a836fb3e056df1f7180620c362f6db43c7fbef04f not found: ID does not exist" Feb 18 06:12:54 crc kubenswrapper[4869]: I0218 06:12:54.450052 4869 scope.go:117] "RemoveContainer" containerID="6c3868ca9271e8d21e2f4cb27b41693251c37c9c44ea6ccbbe5a344e93d2461d" Feb 18 06:12:54 crc kubenswrapper[4869]: E0218 06:12:54.450260 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c3868ca9271e8d21e2f4cb27b41693251c37c9c44ea6ccbbe5a344e93d2461d\": container with ID starting with 6c3868ca9271e8d21e2f4cb27b41693251c37c9c44ea6ccbbe5a344e93d2461d not found: ID does not exist" containerID="6c3868ca9271e8d21e2f4cb27b41693251c37c9c44ea6ccbbe5a344e93d2461d" Feb 18 06:12:54 crc kubenswrapper[4869]: I0218 06:12:54.450284 4869 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c3868ca9271e8d21e2f4cb27b41693251c37c9c44ea6ccbbe5a344e93d2461d"} err="failed to get container status \"6c3868ca9271e8d21e2f4cb27b41693251c37c9c44ea6ccbbe5a344e93d2461d\": rpc error: code = NotFound desc = could not find container \"6c3868ca9271e8d21e2f4cb27b41693251c37c9c44ea6ccbbe5a344e93d2461d\": container with ID starting with 6c3868ca9271e8d21e2f4cb27b41693251c37c9c44ea6ccbbe5a344e93d2461d not found: ID does not exist" Feb 18 06:12:55 crc kubenswrapper[4869]: I0218 06:12:55.484234 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc415ece-1b11-419f-a3fc-b973f44010ed" path="/var/lib/kubelet/pods/dc415ece-1b11-419f-a3fc-b973f44010ed/volumes" Feb 18 06:13:09 crc kubenswrapper[4869]: I0218 06:13:09.528870 4869 generic.go:334] "Generic (PLEG): container finished" podID="26716094-10bf-4523-9c23-674dd4b7d517" containerID="c1307627db66327ffb9c8f6caed63da3b49f9b44d2ce0c1bb5c60f839e8bb1a8" exitCode=0 Feb 18 06:13:09 crc kubenswrapper[4869]: I0218 06:13:09.529029 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw" event={"ID":"26716094-10bf-4523-9c23-674dd4b7d517","Type":"ContainerDied","Data":"c1307627db66327ffb9c8f6caed63da3b49f9b44d2ce0c1bb5c60f839e8bb1a8"} Feb 18 06:13:10 crc kubenswrapper[4869]: I0218 06:13:10.132544 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:13:10 crc kubenswrapper[4869]: I0218 06:13:10.132607 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:13:11 crc kubenswrapper[4869]: I0218 06:13:11.076588 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw" Feb 18 06:13:11 crc kubenswrapper[4869]: I0218 06:13:11.244602 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26716094-10bf-4523-9c23-674dd4b7d517-bootstrap-combined-ca-bundle\") pod \"26716094-10bf-4523-9c23-674dd4b7d517\" (UID: \"26716094-10bf-4523-9c23-674dd4b7d517\") " Feb 18 06:13:11 crc kubenswrapper[4869]: I0218 06:13:11.245002 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/26716094-10bf-4523-9c23-674dd4b7d517-ssh-key-openstack-edpm-ipam\") pod \"26716094-10bf-4523-9c23-674dd4b7d517\" (UID: \"26716094-10bf-4523-9c23-674dd4b7d517\") " Feb 18 06:13:11 crc kubenswrapper[4869]: I0218 06:13:11.245112 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7zd7\" (UniqueName: \"kubernetes.io/projected/26716094-10bf-4523-9c23-674dd4b7d517-kube-api-access-s7zd7\") pod \"26716094-10bf-4523-9c23-674dd4b7d517\" (UID: \"26716094-10bf-4523-9c23-674dd4b7d517\") " Feb 18 06:13:11 crc kubenswrapper[4869]: I0218 06:13:11.245258 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26716094-10bf-4523-9c23-674dd4b7d517-inventory\") pod \"26716094-10bf-4523-9c23-674dd4b7d517\" (UID: \"26716094-10bf-4523-9c23-674dd4b7d517\") " Feb 18 06:13:11 crc kubenswrapper[4869]: I0218 06:13:11.252163 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/26716094-10bf-4523-9c23-674dd4b7d517-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "26716094-10bf-4523-9c23-674dd4b7d517" (UID: "26716094-10bf-4523-9c23-674dd4b7d517"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:13:11 crc kubenswrapper[4869]: I0218 06:13:11.252411 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26716094-10bf-4523-9c23-674dd4b7d517-kube-api-access-s7zd7" (OuterVolumeSpecName: "kube-api-access-s7zd7") pod "26716094-10bf-4523-9c23-674dd4b7d517" (UID: "26716094-10bf-4523-9c23-674dd4b7d517"). InnerVolumeSpecName "kube-api-access-s7zd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:13:11 crc kubenswrapper[4869]: I0218 06:13:11.276165 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26716094-10bf-4523-9c23-674dd4b7d517-inventory" (OuterVolumeSpecName: "inventory") pod "26716094-10bf-4523-9c23-674dd4b7d517" (UID: "26716094-10bf-4523-9c23-674dd4b7d517"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:13:11 crc kubenswrapper[4869]: I0218 06:13:11.276604 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26716094-10bf-4523-9c23-674dd4b7d517-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "26716094-10bf-4523-9c23-674dd4b7d517" (UID: "26716094-10bf-4523-9c23-674dd4b7d517"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:13:11 crc kubenswrapper[4869]: I0218 06:13:11.347818 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7zd7\" (UniqueName: \"kubernetes.io/projected/26716094-10bf-4523-9c23-674dd4b7d517-kube-api-access-s7zd7\") on node \"crc\" DevicePath \"\"" Feb 18 06:13:11 crc kubenswrapper[4869]: I0218 06:13:11.347872 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26716094-10bf-4523-9c23-674dd4b7d517-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:13:11 crc kubenswrapper[4869]: I0218 06:13:11.347892 4869 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26716094-10bf-4523-9c23-674dd4b7d517-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:13:11 crc kubenswrapper[4869]: I0218 06:13:11.347907 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/26716094-10bf-4523-9c23-674dd4b7d517-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:13:12 crc kubenswrapper[4869]: I0218 06:13:12.186866 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn"] Feb 18 06:13:12 crc kubenswrapper[4869]: E0218 06:13:12.187233 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc415ece-1b11-419f-a3fc-b973f44010ed" containerName="extract-content" Feb 18 06:13:12 crc kubenswrapper[4869]: I0218 06:13:12.187249 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc415ece-1b11-419f-a3fc-b973f44010ed" containerName="extract-content" Feb 18 06:13:12 crc kubenswrapper[4869]: E0218 06:13:12.187274 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc415ece-1b11-419f-a3fc-b973f44010ed" containerName="extract-utilities" Feb 18 06:13:12 crc 
kubenswrapper[4869]: I0218 06:13:12.187282 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc415ece-1b11-419f-a3fc-b973f44010ed" containerName="extract-utilities"
Feb 18 06:13:12 crc kubenswrapper[4869]: E0218 06:13:12.187300 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26716094-10bf-4523-9c23-674dd4b7d517" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Feb 18 06:13:12 crc kubenswrapper[4869]: I0218 06:13:12.187309 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="26716094-10bf-4523-9c23-674dd4b7d517" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Feb 18 06:13:12 crc kubenswrapper[4869]: E0218 06:13:12.187341 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc415ece-1b11-419f-a3fc-b973f44010ed" containerName="registry-server"
Feb 18 06:13:12 crc kubenswrapper[4869]: I0218 06:13:12.187350 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc415ece-1b11-419f-a3fc-b973f44010ed" containerName="registry-server"
Feb 18 06:13:12 crc kubenswrapper[4869]: I0218 06:13:12.187567 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc415ece-1b11-419f-a3fc-b973f44010ed" containerName="registry-server"
Feb 18 06:13:12 crc kubenswrapper[4869]: I0218 06:13:12.187588 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="26716094-10bf-4523-9c23-674dd4b7d517" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Feb 18 06:13:12 crc kubenswrapper[4869]: I0218 06:13:12.188252 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn"]
Feb 18 06:13:12 crc kubenswrapper[4869]: I0218 06:13:12.188349 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn"
Feb 18 06:13:12 crc kubenswrapper[4869]: I0218 06:13:12.191423 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw" event={"ID":"26716094-10bf-4523-9c23-674dd4b7d517","Type":"ContainerDied","Data":"3f3d008d40e8385ad233675d7a2c26d3eb1fba84083e8734c3455408341cb45d"}
Feb 18 06:13:12 crc kubenswrapper[4869]: I0218 06:13:12.191491 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f3d008d40e8385ad233675d7a2c26d3eb1fba84083e8734c3455408341cb45d"
Feb 18 06:13:12 crc kubenswrapper[4869]: I0218 06:13:12.191977 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw"
Feb 18 06:13:12 crc kubenswrapper[4869]: I0218 06:13:12.265917 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/102527af-43b3-4260-bdbf-cd653b203986-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn\" (UID: \"102527af-43b3-4260-bdbf-cd653b203986\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn"
Feb 18 06:13:12 crc kubenswrapper[4869]: I0218 06:13:12.266033 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl4ck\" (UniqueName: \"kubernetes.io/projected/102527af-43b3-4260-bdbf-cd653b203986-kube-api-access-sl4ck\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn\" (UID: \"102527af-43b3-4260-bdbf-cd653b203986\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn"
Feb 18 06:13:12 crc kubenswrapper[4869]: I0218 06:13:12.266062 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/102527af-43b3-4260-bdbf-cd653b203986-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn\" (UID: \"102527af-43b3-4260-bdbf-cd653b203986\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn"
Feb 18 06:13:12 crc kubenswrapper[4869]: I0218 06:13:12.367358 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl4ck\" (UniqueName: \"kubernetes.io/projected/102527af-43b3-4260-bdbf-cd653b203986-kube-api-access-sl4ck\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn\" (UID: \"102527af-43b3-4260-bdbf-cd653b203986\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn"
Feb 18 06:13:12 crc kubenswrapper[4869]: I0218 06:13:12.367415 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/102527af-43b3-4260-bdbf-cd653b203986-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn\" (UID: \"102527af-43b3-4260-bdbf-cd653b203986\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn"
Feb 18 06:13:12 crc kubenswrapper[4869]: I0218 06:13:12.367479 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/102527af-43b3-4260-bdbf-cd653b203986-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn\" (UID: \"102527af-43b3-4260-bdbf-cd653b203986\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn"
Feb 18 06:13:12 crc kubenswrapper[4869]: I0218 06:13:12.373844 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/102527af-43b3-4260-bdbf-cd653b203986-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn\" (UID: \"102527af-43b3-4260-bdbf-cd653b203986\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn"
Feb 18 06:13:12 crc kubenswrapper[4869]: I0218 06:13:12.385398 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/102527af-43b3-4260-bdbf-cd653b203986-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn\" (UID: \"102527af-43b3-4260-bdbf-cd653b203986\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn"
Feb 18 06:13:12 crc kubenswrapper[4869]: I0218 06:13:12.389470 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl4ck\" (UniqueName: \"kubernetes.io/projected/102527af-43b3-4260-bdbf-cd653b203986-kube-api-access-sl4ck\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn\" (UID: \"102527af-43b3-4260-bdbf-cd653b203986\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn"
Feb 18 06:13:12 crc kubenswrapper[4869]: I0218 06:13:12.533213 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn"
Feb 18 06:13:13 crc kubenswrapper[4869]: I0218 06:13:13.064574 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn"]
Feb 18 06:13:13 crc kubenswrapper[4869]: W0218 06:13:13.076512 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod102527af_43b3_4260_bdbf_cd653b203986.slice/crio-54b52406a58f3aa240c73b25adbfe6f3cd98944eaa168f97554c1f8c0a6d8309 WatchSource:0}: Error finding container 54b52406a58f3aa240c73b25adbfe6f3cd98944eaa168f97554c1f8c0a6d8309: Status 404 returned error can't find the container with id 54b52406a58f3aa240c73b25adbfe6f3cd98944eaa168f97554c1f8c0a6d8309
Feb 18 06:13:13 crc kubenswrapper[4869]: I0218 06:13:13.079783 4869 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 18 06:13:13 crc kubenswrapper[4869]: I0218 06:13:13.211269 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn" event={"ID":"102527af-43b3-4260-bdbf-cd653b203986","Type":"ContainerStarted","Data":"54b52406a58f3aa240c73b25adbfe6f3cd98944eaa168f97554c1f8c0a6d8309"}
Feb 18 06:13:14 crc kubenswrapper[4869]: I0218 06:13:14.221240 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn" event={"ID":"102527af-43b3-4260-bdbf-cd653b203986","Type":"ContainerStarted","Data":"02509f30e4fb5eacdc90a5f477f0bde08d8896a20e62786630f3e5c50c961a68"}
Feb 18 06:13:14 crc kubenswrapper[4869]: I0218 06:13:14.249528 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn" podStartSLOduration=2.7693149249999998 podStartE2EDuration="3.249505673s" podCreationTimestamp="2026-02-18 06:13:11 +0000 UTC" firstStartedPulling="2026-02-18 06:13:13.079329834 +0000 UTC m=+1490.248418096" lastFinishedPulling="2026-02-18 06:13:13.559520602 +0000 UTC m=+1490.728608844" observedRunningTime="2026-02-18 06:13:14.239793222 +0000 UTC m=+1491.408881464" watchObservedRunningTime="2026-02-18 06:13:14.249505673 +0000 UTC m=+1491.418593905"
Feb 18 06:13:40 crc kubenswrapper[4869]: I0218 06:13:40.132724 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 06:13:40 crc kubenswrapper[4869]: I0218 06:13:40.133452 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 06:13:40 crc kubenswrapper[4869]: I0218 06:13:40.133521 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh"
Feb 18 06:13:40 crc kubenswrapper[4869]: I0218 06:13:40.134633 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7"} pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 18 06:13:40 crc kubenswrapper[4869]: I0218 06:13:40.134728 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" containerID="cri-o://6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7" gracePeriod=600
Feb 18 06:13:40 crc kubenswrapper[4869]: I0218 06:13:40.475665 4869 generic.go:334] "Generic (PLEG): container finished" podID="781aec66-5fc7-4161-a704-cc78830d525d" containerID="6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7" exitCode=0
Feb 18 06:13:40 crc kubenswrapper[4869]: I0218 06:13:40.475736 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerDied","Data":"6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7"}
Feb 18 06:13:40 crc kubenswrapper[4869]: I0218 06:13:40.475924 4869 scope.go:117] "RemoveContainer" containerID="37492f897717923690caff194814f180694044bf504a2fcda1d5391e8ea76923"
Feb 18 06:13:40 crc kubenswrapper[4869]: E0218 06:13:40.780471 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d"
Feb 18 06:13:41 crc kubenswrapper[4869]: I0218 06:13:41.493909 4869 scope.go:117] "RemoveContainer" containerID="6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7"
Feb 18 06:13:41 crc kubenswrapper[4869]: E0218 06:13:41.494672 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d"
Feb 18 06:13:52 crc kubenswrapper[4869]: I0218 06:13:52.469784 4869 scope.go:117] "RemoveContainer" containerID="6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7"
Feb 18 06:13:52 crc kubenswrapper[4869]: E0218 06:13:52.470460 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d"
Feb 18 06:14:06 crc kubenswrapper[4869]: I0218 06:14:06.471158 4869 scope.go:117] "RemoveContainer" containerID="6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7"
Feb 18 06:14:06 crc kubenswrapper[4869]: E0218 06:14:06.472629 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d"
Feb 18 06:14:17 crc kubenswrapper[4869]: I0218 06:14:17.471262 4869 scope.go:117] "RemoveContainer" containerID="6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7"
Feb 18 06:14:17 crc kubenswrapper[4869]: E0218 06:14:17.472318 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d"
Feb 18 06:14:32 crc kubenswrapper[4869]: I0218 06:14:32.470856 4869 scope.go:117] "RemoveContainer" containerID="6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7"
Feb 18 06:14:32 crc kubenswrapper[4869]: E0218 06:14:32.473112 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d"
Feb 18 06:14:34 crc kubenswrapper[4869]: I0218 06:14:34.043859 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-vsntf"]
Feb 18 06:14:34 crc kubenswrapper[4869]: I0218 06:14:34.056866 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-sr5s4"]
Feb 18 06:14:34 crc kubenswrapper[4869]: I0218 06:14:34.069442 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-sr5s4"]
Feb 18 06:14:34 crc kubenswrapper[4869]: I0218 06:14:34.081490 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-vsntf"]
Feb 18 06:14:35 crc kubenswrapper[4869]: I0218 06:14:35.038259 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-9a85-account-create-update-tbcd2"]
Feb 18 06:14:35 crc kubenswrapper[4869]: I0218 06:14:35.048664 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-2d5b-account-create-update-7v5gt"]
Feb 18 06:14:35 crc kubenswrapper[4869]: I0218 06:14:35.059507 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-9a85-account-create-update-tbcd2"]
Feb 18 06:14:35 crc kubenswrapper[4869]: I0218 06:14:35.071367 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-2d5b-account-create-update-7v5gt"]
Feb 18 06:14:35 crc kubenswrapper[4869]: I0218 06:14:35.480312 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1adc4451-3322-4686-aeb5-ea4a457724a4" path="/var/lib/kubelet/pods/1adc4451-3322-4686-aeb5-ea4a457724a4/volumes"
Feb 18 06:14:35 crc kubenswrapper[4869]: I0218 06:14:35.481221 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2ae3a7e-91a4-4a0f-9537-d0438699a82f" path="/var/lib/kubelet/pods/a2ae3a7e-91a4-4a0f-9537-d0438699a82f/volumes"
Feb 18 06:14:35 crc kubenswrapper[4869]: I0218 06:14:35.482009 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6" path="/var/lib/kubelet/pods/d7b2b37a-5aa5-408a-afe8-6bca0b1aa9e6/volumes"
Feb 18 06:14:35 crc kubenswrapper[4869]: I0218 06:14:35.482677 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8bd308c-0bb8-49f6-a629-8d24b1c5d55e" path="/var/lib/kubelet/pods/f8bd308c-0bb8-49f6-a629-8d24b1c5d55e/volumes"
Feb 18 06:14:36 crc kubenswrapper[4869]: I0218 06:14:36.024565 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-jhdcd"]
Feb 18 06:14:36 crc kubenswrapper[4869]: I0218 06:14:36.039059 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-143a-account-create-update-n6nb8"]
Feb 18 06:14:36 crc kubenswrapper[4869]: I0218 06:14:36.047298 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-jhdcd"]
Feb 18 06:14:36 crc kubenswrapper[4869]: I0218 06:14:36.056376 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-143a-account-create-update-n6nb8"]
Feb 18 06:14:37 crc kubenswrapper[4869]: I0218 06:14:37.480298 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2268e717-5605-4efd-aed0-0323235e9211" path="/var/lib/kubelet/pods/2268e717-5605-4efd-aed0-0323235e9211/volumes"
Feb 18 06:14:37 crc kubenswrapper[4869]: I0218 06:14:37.481053 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaf77221-4fa8-4a41-b019-042eb77a2553" path="/var/lib/kubelet/pods/aaf77221-4fa8-4a41-b019-042eb77a2553/volumes"
Feb 18 06:14:43 crc kubenswrapper[4869]: I0218 06:14:43.264869 4869 generic.go:334] "Generic (PLEG): container finished" podID="102527af-43b3-4260-bdbf-cd653b203986" containerID="02509f30e4fb5eacdc90a5f477f0bde08d8896a20e62786630f3e5c50c961a68" exitCode=0
Feb 18 06:14:43 crc kubenswrapper[4869]: I0218 06:14:43.264960 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn" event={"ID":"102527af-43b3-4260-bdbf-cd653b203986","Type":"ContainerDied","Data":"02509f30e4fb5eacdc90a5f477f0bde08d8896a20e62786630f3e5c50c961a68"}
Feb 18 06:14:44 crc kubenswrapper[4869]: I0218 06:14:44.662422 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn"
Feb 18 06:14:44 crc kubenswrapper[4869]: I0218 06:14:44.779818 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/102527af-43b3-4260-bdbf-cd653b203986-inventory\") pod \"102527af-43b3-4260-bdbf-cd653b203986\" (UID: \"102527af-43b3-4260-bdbf-cd653b203986\") "
Feb 18 06:14:44 crc kubenswrapper[4869]: I0218 06:14:44.779998 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/102527af-43b3-4260-bdbf-cd653b203986-ssh-key-openstack-edpm-ipam\") pod \"102527af-43b3-4260-bdbf-cd653b203986\" (UID: \"102527af-43b3-4260-bdbf-cd653b203986\") "
Feb 18 06:14:44 crc kubenswrapper[4869]: I0218 06:14:44.780060 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl4ck\" (UniqueName: \"kubernetes.io/projected/102527af-43b3-4260-bdbf-cd653b203986-kube-api-access-sl4ck\") pod \"102527af-43b3-4260-bdbf-cd653b203986\" (UID: \"102527af-43b3-4260-bdbf-cd653b203986\") "
Feb 18 06:14:44 crc kubenswrapper[4869]: I0218 06:14:44.787634 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/102527af-43b3-4260-bdbf-cd653b203986-kube-api-access-sl4ck" (OuterVolumeSpecName: "kube-api-access-sl4ck") pod "102527af-43b3-4260-bdbf-cd653b203986" (UID: "102527af-43b3-4260-bdbf-cd653b203986"). InnerVolumeSpecName "kube-api-access-sl4ck". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:14:44 crc kubenswrapper[4869]: I0218 06:14:44.812500 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/102527af-43b3-4260-bdbf-cd653b203986-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "102527af-43b3-4260-bdbf-cd653b203986" (UID: "102527af-43b3-4260-bdbf-cd653b203986"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:14:44 crc kubenswrapper[4869]: I0218 06:14:44.828273 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/102527af-43b3-4260-bdbf-cd653b203986-inventory" (OuterVolumeSpecName: "inventory") pod "102527af-43b3-4260-bdbf-cd653b203986" (UID: "102527af-43b3-4260-bdbf-cd653b203986"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:14:44 crc kubenswrapper[4869]: I0218 06:14:44.882316 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/102527af-43b3-4260-bdbf-cd653b203986-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 18 06:14:44 crc kubenswrapper[4869]: I0218 06:14:44.882348 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl4ck\" (UniqueName: \"kubernetes.io/projected/102527af-43b3-4260-bdbf-cd653b203986-kube-api-access-sl4ck\") on node \"crc\" DevicePath \"\""
Feb 18 06:14:44 crc kubenswrapper[4869]: I0218 06:14:44.882358 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/102527af-43b3-4260-bdbf-cd653b203986-inventory\") on node \"crc\" DevicePath \"\""
Feb 18 06:14:45 crc kubenswrapper[4869]: I0218 06:14:45.287141 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn" event={"ID":"102527af-43b3-4260-bdbf-cd653b203986","Type":"ContainerDied","Data":"54b52406a58f3aa240c73b25adbfe6f3cd98944eaa168f97554c1f8c0a6d8309"}
Feb 18 06:14:45 crc kubenswrapper[4869]: I0218 06:14:45.287193 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54b52406a58f3aa240c73b25adbfe6f3cd98944eaa168f97554c1f8c0a6d8309"
Feb 18 06:14:45 crc kubenswrapper[4869]: I0218 06:14:45.287250 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn"
Feb 18 06:14:45 crc kubenswrapper[4869]: I0218 06:14:45.377241 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn"]
Feb 18 06:14:45 crc kubenswrapper[4869]: E0218 06:14:45.377616 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="102527af-43b3-4260-bdbf-cd653b203986" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Feb 18 06:14:45 crc kubenswrapper[4869]: I0218 06:14:45.377633 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="102527af-43b3-4260-bdbf-cd653b203986" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Feb 18 06:14:45 crc kubenswrapper[4869]: I0218 06:14:45.377869 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="102527af-43b3-4260-bdbf-cd653b203986" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Feb 18 06:14:45 crc kubenswrapper[4869]: I0218 06:14:45.378424 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn"
Feb 18 06:14:45 crc kubenswrapper[4869]: I0218 06:14:45.381836 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5vjl5"
Feb 18 06:14:45 crc kubenswrapper[4869]: I0218 06:14:45.381869 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 18 06:14:45 crc kubenswrapper[4869]: I0218 06:14:45.381966 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 18 06:14:45 crc kubenswrapper[4869]: I0218 06:14:45.381990 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 18 06:14:45 crc kubenswrapper[4869]: I0218 06:14:45.390375 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn"]
Feb 18 06:14:45 crc kubenswrapper[4869]: I0218 06:14:45.470401 4869 scope.go:117] "RemoveContainer" containerID="6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7"
Feb 18 06:14:45 crc kubenswrapper[4869]: E0218 06:14:45.470661 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d"
Feb 18 06:14:45 crc kubenswrapper[4869]: I0218 06:14:45.492189 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1475668a-1132-4548-a5e6-0f4a459480c1-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn\" (UID: \"1475668a-1132-4548-a5e6-0f4a459480c1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn"
Feb 18 06:14:45 crc kubenswrapper[4869]: I0218 06:14:45.492392 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twphb\" (UniqueName: \"kubernetes.io/projected/1475668a-1132-4548-a5e6-0f4a459480c1-kube-api-access-twphb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn\" (UID: \"1475668a-1132-4548-a5e6-0f4a459480c1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn"
Feb 18 06:14:45 crc kubenswrapper[4869]: I0218 06:14:45.492442 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1475668a-1132-4548-a5e6-0f4a459480c1-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn\" (UID: \"1475668a-1132-4548-a5e6-0f4a459480c1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn"
Feb 18 06:14:45 crc kubenswrapper[4869]: I0218 06:14:45.594155 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twphb\" (UniqueName: \"kubernetes.io/projected/1475668a-1132-4548-a5e6-0f4a459480c1-kube-api-access-twphb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn\" (UID: \"1475668a-1132-4548-a5e6-0f4a459480c1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn"
Feb 18 06:14:45 crc kubenswrapper[4869]: I0218 06:14:45.594243 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1475668a-1132-4548-a5e6-0f4a459480c1-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn\" (UID: \"1475668a-1132-4548-a5e6-0f4a459480c1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn"
Feb 18 06:14:45 crc kubenswrapper[4869]: I0218 06:14:45.594279 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1475668a-1132-4548-a5e6-0f4a459480c1-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn\" (UID: \"1475668a-1132-4548-a5e6-0f4a459480c1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn"
Feb 18 06:14:45 crc kubenswrapper[4869]: I0218 06:14:45.598861 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1475668a-1132-4548-a5e6-0f4a459480c1-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn\" (UID: \"1475668a-1132-4548-a5e6-0f4a459480c1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn"
Feb 18 06:14:45 crc kubenswrapper[4869]: I0218 06:14:45.599052 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1475668a-1132-4548-a5e6-0f4a459480c1-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn\" (UID: \"1475668a-1132-4548-a5e6-0f4a459480c1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn"
Feb 18 06:14:45 crc kubenswrapper[4869]: I0218 06:14:45.610015 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twphb\" (UniqueName: \"kubernetes.io/projected/1475668a-1132-4548-a5e6-0f4a459480c1-kube-api-access-twphb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn\" (UID: \"1475668a-1132-4548-a5e6-0f4a459480c1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn"
Feb 18 06:14:45 crc kubenswrapper[4869]: I0218 06:14:45.706484 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn"
Feb 18 06:14:46 crc kubenswrapper[4869]: I0218 06:14:46.242871 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn"]
Feb 18 06:14:46 crc kubenswrapper[4869]: I0218 06:14:46.295303 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn" event={"ID":"1475668a-1132-4548-a5e6-0f4a459480c1","Type":"ContainerStarted","Data":"3d6602b88971c02d989f730f37fb398e0ef2a98894d48e00166d99db94d7cd1d"}
Feb 18 06:14:47 crc kubenswrapper[4869]: I0218 06:14:47.306160 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn" event={"ID":"1475668a-1132-4548-a5e6-0f4a459480c1","Type":"ContainerStarted","Data":"5ccf0fe1af636d4aecc79a0704259f418beb528710b39c618a518d868c7754d1"}
Feb 18 06:14:47 crc kubenswrapper[4869]: I0218 06:14:47.351513 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn" podStartSLOduration=1.945410834 podStartE2EDuration="2.351485986s" podCreationTimestamp="2026-02-18 06:14:45 +0000 UTC" firstStartedPulling="2026-02-18 06:14:46.24491784 +0000 UTC m=+1583.414006072" lastFinishedPulling="2026-02-18 06:14:46.650992982 +0000 UTC m=+1583.820081224" observedRunningTime="2026-02-18 06:14:47.319585532 +0000 UTC m=+1584.488673804" watchObservedRunningTime="2026-02-18 06:14:47.351485986 +0000 UTC m=+1584.520574228"
Feb 18 06:14:57 crc kubenswrapper[4869]: I0218 06:14:57.470270 4869 scope.go:117] "RemoveContainer" containerID="6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7"
Feb 18 06:14:57 crc kubenswrapper[4869]: E0218 06:14:57.471242 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d"
Feb 18 06:15:00 crc kubenswrapper[4869]: I0218 06:15:00.164386 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523255-mgfw5"]
Feb 18 06:15:00 crc kubenswrapper[4869]: I0218 06:15:00.167141 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-mgfw5"
Feb 18 06:15:00 crc kubenswrapper[4869]: I0218 06:15:00.171548 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 18 06:15:00 crc kubenswrapper[4869]: I0218 06:15:00.171630 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 18 06:15:00 crc kubenswrapper[4869]: I0218 06:15:00.186297 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523255-mgfw5"]
Feb 18 06:15:00 crc kubenswrapper[4869]: I0218 06:15:00.326783 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07cfdf2a-a3c6-4a6c-943b-86814cd86513-secret-volume\") pod \"collect-profiles-29523255-mgfw5\" (UID: \"07cfdf2a-a3c6-4a6c-943b-86814cd86513\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-mgfw5"
Feb 18 06:15:00 crc kubenswrapper[4869]: I0218 06:15:00.326945 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42xsn\" (UniqueName: \"kubernetes.io/projected/07cfdf2a-a3c6-4a6c-943b-86814cd86513-kube-api-access-42xsn\") pod \"collect-profiles-29523255-mgfw5\" (UID: \"07cfdf2a-a3c6-4a6c-943b-86814cd86513\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-mgfw5"
Feb 18 06:15:00 crc kubenswrapper[4869]: I0218 06:15:00.327005 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07cfdf2a-a3c6-4a6c-943b-86814cd86513-config-volume\") pod \"collect-profiles-29523255-mgfw5\" (UID: \"07cfdf2a-a3c6-4a6c-943b-86814cd86513\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-mgfw5"
Feb 18 06:15:00 crc kubenswrapper[4869]: I0218 06:15:00.428903 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07cfdf2a-a3c6-4a6c-943b-86814cd86513-config-volume\") pod \"collect-profiles-29523255-mgfw5\" (UID: \"07cfdf2a-a3c6-4a6c-943b-86814cd86513\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-mgfw5"
Feb 18 06:15:00 crc kubenswrapper[4869]: I0218 06:15:00.429089 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07cfdf2a-a3c6-4a6c-943b-86814cd86513-secret-volume\") pod \"collect-profiles-29523255-mgfw5\" (UID: \"07cfdf2a-a3c6-4a6c-943b-86814cd86513\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-mgfw5"
Feb 18 06:15:00 crc kubenswrapper[4869]: I0218 06:15:00.429161 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42xsn\" (UniqueName: \"kubernetes.io/projected/07cfdf2a-a3c6-4a6c-943b-86814cd86513-kube-api-access-42xsn\") pod \"collect-profiles-29523255-mgfw5\" (UID: \"07cfdf2a-a3c6-4a6c-943b-86814cd86513\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-mgfw5"
Feb 18 06:15:00 crc kubenswrapper[4869]: I0218 06:15:00.429920 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07cfdf2a-a3c6-4a6c-943b-86814cd86513-config-volume\") pod \"collect-profiles-29523255-mgfw5\" (UID: \"07cfdf2a-a3c6-4a6c-943b-86814cd86513\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-mgfw5"
Feb 18 06:15:00 crc kubenswrapper[4869]: I0218 06:15:00.435254 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07cfdf2a-a3c6-4a6c-943b-86814cd86513-secret-volume\") pod \"collect-profiles-29523255-mgfw5\" (UID: \"07cfdf2a-a3c6-4a6c-943b-86814cd86513\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-mgfw5"
Feb 18 06:15:00 crc kubenswrapper[4869]: I0218 06:15:00.445090 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42xsn\" (UniqueName: \"kubernetes.io/projected/07cfdf2a-a3c6-4a6c-943b-86814cd86513-kube-api-access-42xsn\") pod \"collect-profiles-29523255-mgfw5\" (UID: \"07cfdf2a-a3c6-4a6c-943b-86814cd86513\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-mgfw5"
Feb 18 06:15:00 crc kubenswrapper[4869]: I0218 06:15:00.495860 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-mgfw5"
Feb 18 06:15:00 crc kubenswrapper[4869]: I0218 06:15:00.935320 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523255-mgfw5"]
Feb 18 06:15:01 crc kubenswrapper[4869]: I0218 06:15:01.430699 4869 generic.go:334] "Generic (PLEG): container finished" podID="07cfdf2a-a3c6-4a6c-943b-86814cd86513" containerID="4c9c516286e0d854a30a1a75f748c6aa09c4f6066e9f680621c4b6fe62ba051c" exitCode=0
Feb 18 06:15:01 crc kubenswrapper[4869]: I0218 06:15:01.430784 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-mgfw5" event={"ID":"07cfdf2a-a3c6-4a6c-943b-86814cd86513","Type":"ContainerDied","Data":"4c9c516286e0d854a30a1a75f748c6aa09c4f6066e9f680621c4b6fe62ba051c"}
Feb 18 06:15:01 crc kubenswrapper[4869]: I0218 06:15:01.431058 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-mgfw5" event={"ID":"07cfdf2a-a3c6-4a6c-943b-86814cd86513","Type":"ContainerStarted","Data":"bd0ef5ab5367959dfb4fbc9c2163ce58796b83eb03f18a5dac858f5631f6f653"}
Feb 18 06:15:02 crc kubenswrapper[4869]: I0218 06:15:02.754643 4869 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-mgfw5" Feb 18 06:15:02 crc kubenswrapper[4869]: I0218 06:15:02.874949 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07cfdf2a-a3c6-4a6c-943b-86814cd86513-secret-volume\") pod \"07cfdf2a-a3c6-4a6c-943b-86814cd86513\" (UID: \"07cfdf2a-a3c6-4a6c-943b-86814cd86513\") " Feb 18 06:15:02 crc kubenswrapper[4869]: I0218 06:15:02.875014 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07cfdf2a-a3c6-4a6c-943b-86814cd86513-config-volume\") pod \"07cfdf2a-a3c6-4a6c-943b-86814cd86513\" (UID: \"07cfdf2a-a3c6-4a6c-943b-86814cd86513\") " Feb 18 06:15:02 crc kubenswrapper[4869]: I0218 06:15:02.875050 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42xsn\" (UniqueName: \"kubernetes.io/projected/07cfdf2a-a3c6-4a6c-943b-86814cd86513-kube-api-access-42xsn\") pod \"07cfdf2a-a3c6-4a6c-943b-86814cd86513\" (UID: \"07cfdf2a-a3c6-4a6c-943b-86814cd86513\") " Feb 18 06:15:02 crc kubenswrapper[4869]: I0218 06:15:02.875722 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07cfdf2a-a3c6-4a6c-943b-86814cd86513-config-volume" (OuterVolumeSpecName: "config-volume") pod "07cfdf2a-a3c6-4a6c-943b-86814cd86513" (UID: "07cfdf2a-a3c6-4a6c-943b-86814cd86513"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:15:02 crc kubenswrapper[4869]: I0218 06:15:02.875884 4869 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07cfdf2a-a3c6-4a6c-943b-86814cd86513-config-volume\") on node \"crc\" DevicePath \"\"" Feb 18 06:15:02 crc kubenswrapper[4869]: I0218 06:15:02.880954 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07cfdf2a-a3c6-4a6c-943b-86814cd86513-kube-api-access-42xsn" (OuterVolumeSpecName: "kube-api-access-42xsn") pod "07cfdf2a-a3c6-4a6c-943b-86814cd86513" (UID: "07cfdf2a-a3c6-4a6c-943b-86814cd86513"). InnerVolumeSpecName "kube-api-access-42xsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:15:02 crc kubenswrapper[4869]: I0218 06:15:02.882125 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07cfdf2a-a3c6-4a6c-943b-86814cd86513-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "07cfdf2a-a3c6-4a6c-943b-86814cd86513" (UID: "07cfdf2a-a3c6-4a6c-943b-86814cd86513"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:15:02 crc kubenswrapper[4869]: I0218 06:15:02.978010 4869 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07cfdf2a-a3c6-4a6c-943b-86814cd86513-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 18 06:15:02 crc kubenswrapper[4869]: I0218 06:15:02.978315 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42xsn\" (UniqueName: \"kubernetes.io/projected/07cfdf2a-a3c6-4a6c-943b-86814cd86513-kube-api-access-42xsn\") on node \"crc\" DevicePath \"\"" Feb 18 06:15:03 crc kubenswrapper[4869]: I0218 06:15:03.449272 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-mgfw5" event={"ID":"07cfdf2a-a3c6-4a6c-943b-86814cd86513","Type":"ContainerDied","Data":"bd0ef5ab5367959dfb4fbc9c2163ce58796b83eb03f18a5dac858f5631f6f653"} Feb 18 06:15:03 crc kubenswrapper[4869]: I0218 06:15:03.449314 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd0ef5ab5367959dfb4fbc9c2163ce58796b83eb03f18a5dac858f5631f6f653" Feb 18 06:15:03 crc kubenswrapper[4869]: I0218 06:15:03.449330 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-mgfw5" Feb 18 06:15:05 crc kubenswrapper[4869]: I0218 06:15:05.040258 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-g2c4c"] Feb 18 06:15:05 crc kubenswrapper[4869]: I0218 06:15:05.049081 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-g2c4c"] Feb 18 06:15:05 crc kubenswrapper[4869]: I0218 06:15:05.480767 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7aed2e80-e13f-49c6-8570-f6384bc1a079" path="/var/lib/kubelet/pods/7aed2e80-e13f-49c6-8570-f6384bc1a079/volumes" Feb 18 06:15:08 crc kubenswrapper[4869]: I0218 06:15:08.027389 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-n69mt"] Feb 18 06:15:08 crc kubenswrapper[4869]: I0218 06:15:08.036243 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-n69mt"] Feb 18 06:15:09 crc kubenswrapper[4869]: I0218 06:15:09.471108 4869 scope.go:117] "RemoveContainer" containerID="6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7" Feb 18 06:15:09 crc kubenswrapper[4869]: E0218 06:15:09.471854 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:15:09 crc kubenswrapper[4869]: I0218 06:15:09.488815 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dfb4ffc-626d-4998-8c3e-422f3f41ac00" path="/var/lib/kubelet/pods/8dfb4ffc-626d-4998-8c3e-422f3f41ac00/volumes" Feb 18 06:15:17 crc kubenswrapper[4869]: I0218 06:15:17.030624 4869 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-dvmlc"] Feb 18 06:15:17 crc kubenswrapper[4869]: I0218 06:15:17.039857 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-dvmlc"] Feb 18 06:15:17 crc kubenswrapper[4869]: I0218 06:15:17.480589 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05aa0f74-aab3-44e4-805b-4d4df0c86c5b" path="/var/lib/kubelet/pods/05aa0f74-aab3-44e4-805b-4d4df0c86c5b/volumes" Feb 18 06:15:18 crc kubenswrapper[4869]: I0218 06:15:18.030912 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-e36c-account-create-update-49m75"] Feb 18 06:15:18 crc kubenswrapper[4869]: I0218 06:15:18.045013 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-e36c-account-create-update-49m75"] Feb 18 06:15:19 crc kubenswrapper[4869]: I0218 06:15:19.481382 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="544d6a86-e6d4-47b4-916e-2dfbe467b5f6" path="/var/lib/kubelet/pods/544d6a86-e6d4-47b4-916e-2dfbe467b5f6/volumes" Feb 18 06:15:21 crc kubenswrapper[4869]: I0218 06:15:21.034595 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-rskmj"] Feb 18 06:15:21 crc kubenswrapper[4869]: I0218 06:15:21.044671 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-50c2-account-create-update-fv8ng"] Feb 18 06:15:21 crc kubenswrapper[4869]: I0218 06:15:21.055548 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-bede-account-create-update-tsvv4"] Feb 18 06:15:21 crc kubenswrapper[4869]: I0218 06:15:21.068396 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-rskmj"] Feb 18 06:15:21 crc kubenswrapper[4869]: I0218 06:15:21.079283 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-5qhrw"] Feb 18 06:15:21 crc kubenswrapper[4869]: I0218 06:15:21.088049 4869 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-bede-account-create-update-tsvv4"] Feb 18 06:15:21 crc kubenswrapper[4869]: I0218 06:15:21.095778 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-50c2-account-create-update-fv8ng"] Feb 18 06:15:21 crc kubenswrapper[4869]: I0218 06:15:21.104519 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-5qhrw"] Feb 18 06:15:21 crc kubenswrapper[4869]: I0218 06:15:21.470539 4869 scope.go:117] "RemoveContainer" containerID="6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7" Feb 18 06:15:21 crc kubenswrapper[4869]: E0218 06:15:21.471097 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:15:21 crc kubenswrapper[4869]: I0218 06:15:21.482459 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d7b2c26-9c52-4095-a3a4-9e97e69d8cda" path="/var/lib/kubelet/pods/5d7b2c26-9c52-4095-a3a4-9e97e69d8cda/volumes" Feb 18 06:15:21 crc kubenswrapper[4869]: I0218 06:15:21.483171 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a9c7e54-b859-4ec1-84ea-65b575a5bb54" path="/var/lib/kubelet/pods/8a9c7e54-b859-4ec1-84ea-65b575a5bb54/volumes" Feb 18 06:15:21 crc kubenswrapper[4869]: I0218 06:15:21.483874 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c76f6717-a534-466b-8c8b-42bf00e770e8" path="/var/lib/kubelet/pods/c76f6717-a534-466b-8c8b-42bf00e770e8/volumes" Feb 18 06:15:21 crc kubenswrapper[4869]: I0218 06:15:21.484569 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ddfad6e9-7d65-4a60-9bbd-c6b552167a4e" path="/var/lib/kubelet/pods/ddfad6e9-7d65-4a60-9bbd-c6b552167a4e/volumes" Feb 18 06:15:25 crc kubenswrapper[4869]: I0218 06:15:25.026835 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-94jf2"] Feb 18 06:15:25 crc kubenswrapper[4869]: I0218 06:15:25.036086 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-94jf2"] Feb 18 06:15:25 crc kubenswrapper[4869]: I0218 06:15:25.487828 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="173c6fc9-4198-471e-b6f3-a7445d402034" path="/var/lib/kubelet/pods/173c6fc9-4198-471e-b6f3-a7445d402034/volumes" Feb 18 06:15:28 crc kubenswrapper[4869]: I0218 06:15:28.599229 4869 scope.go:117] "RemoveContainer" containerID="2f697f4e5ca277b966d46a312c66ae3d1a691cf1bb98ed4d9e4692c1071426d7" Feb 18 06:15:28 crc kubenswrapper[4869]: I0218 06:15:28.645841 4869 scope.go:117] "RemoveContainer" containerID="0a785e08508c261bd7ed591066bda750326d3b253997986524cf3efdd49aacd4" Feb 18 06:15:28 crc kubenswrapper[4869]: I0218 06:15:28.715245 4869 scope.go:117] "RemoveContainer" containerID="2d62c10291ec70b24ec3577c01b2d046a3eaacd18541f9dc481a46ab7bd80493" Feb 18 06:15:28 crc kubenswrapper[4869]: I0218 06:15:28.742936 4869 scope.go:117] "RemoveContainer" containerID="9fcc2fedf44bd0e3860fbb34618e08f8de235f38893936bbf81e8f163ce96ab6" Feb 18 06:15:28 crc kubenswrapper[4869]: I0218 06:15:28.799631 4869 scope.go:117] "RemoveContainer" containerID="32055da025d3efbc8424f8602561e6471dc62fb2fd8e676bfbf9903d0a6c8873" Feb 18 06:15:28 crc kubenswrapper[4869]: I0218 06:15:28.830958 4869 scope.go:117] "RemoveContainer" containerID="4539cebbd7f59990896a3013e3ad7a8002f56e3a5cfb0b1bcb495b2dc27b1f58" Feb 18 06:15:28 crc kubenswrapper[4869]: I0218 06:15:28.874612 4869 scope.go:117] "RemoveContainer" containerID="4a358cdd903eb3bbbe90c423fd331e1493e6ee1d4038d13975c0f8e86d5083c8" Feb 18 06:15:28 crc kubenswrapper[4869]: I0218 
06:15:28.893630 4869 scope.go:117] "RemoveContainer" containerID="4ea7de77e4efd092d0be0efeb21b1d05d57f4a0c8c797f8026ad3a3a526d4420" Feb 18 06:15:28 crc kubenswrapper[4869]: I0218 06:15:28.911420 4869 scope.go:117] "RemoveContainer" containerID="664a9fe9e5f25c24bca12b867ce99a07d03c1cac1b1a472034074022186c981d" Feb 18 06:15:28 crc kubenswrapper[4869]: I0218 06:15:28.941010 4869 scope.go:117] "RemoveContainer" containerID="a8ead2ac67d66baf8286361697a3fa7efc5ff6491ae73c99b9ec749774cf5062" Feb 18 06:15:28 crc kubenswrapper[4869]: I0218 06:15:28.965563 4869 scope.go:117] "RemoveContainer" containerID="bc01ec2d48ab1e6ee0cbace913062ec9603c1593a443ca5b58e55e727c3f6b06" Feb 18 06:15:28 crc kubenswrapper[4869]: I0218 06:15:28.986350 4869 scope.go:117] "RemoveContainer" containerID="fb67abc41fd2a84058fb464f4447e3c9c87d18f4f134427df5cd99f94ec5fdaf" Feb 18 06:15:29 crc kubenswrapper[4869]: I0218 06:15:29.005258 4869 scope.go:117] "RemoveContainer" containerID="a3781a690a1820eed2cbdaf2377feaf6a61f019fe05b12e09a38e10053f32645" Feb 18 06:15:29 crc kubenswrapper[4869]: I0218 06:15:29.024675 4869 scope.go:117] "RemoveContainer" containerID="6bd55a076bf948119308b606a4b72cd5d4a902ac4fd363c8c44252eb06ca1987" Feb 18 06:15:29 crc kubenswrapper[4869]: I0218 06:15:29.041768 4869 scope.go:117] "RemoveContainer" containerID="12a5b1032db1423dd4da6ba897ccbb8d42330c8210880d1e03c015befbaeb840" Feb 18 06:15:36 crc kubenswrapper[4869]: I0218 06:15:36.470413 4869 scope.go:117] "RemoveContainer" containerID="6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7" Feb 18 06:15:36 crc kubenswrapper[4869]: E0218 06:15:36.471549 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:15:48 crc kubenswrapper[4869]: I0218 06:15:48.470237 4869 scope.go:117] "RemoveContainer" containerID="6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7" Feb 18 06:15:48 crc kubenswrapper[4869]: E0218 06:15:48.471180 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:15:54 crc kubenswrapper[4869]: I0218 06:15:54.053357 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-7cn7l"] Feb 18 06:15:54 crc kubenswrapper[4869]: I0218 06:15:54.062645 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-7cn7l"] Feb 18 06:15:55 crc kubenswrapper[4869]: I0218 06:15:55.486734 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c414ef60-f94a-4047-ad26-b7ca6fa3f93b" path="/var/lib/kubelet/pods/c414ef60-f94a-4047-ad26-b7ca6fa3f93b/volumes" Feb 18 06:15:55 crc kubenswrapper[4869]: I0218 06:15:55.944462 4869 generic.go:334] "Generic (PLEG): container finished" podID="1475668a-1132-4548-a5e6-0f4a459480c1" containerID="5ccf0fe1af636d4aecc79a0704259f418beb528710b39c618a518d868c7754d1" exitCode=0 Feb 18 06:15:55 crc kubenswrapper[4869]: I0218 06:15:55.944508 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn" event={"ID":"1475668a-1132-4548-a5e6-0f4a459480c1","Type":"ContainerDied","Data":"5ccf0fe1af636d4aecc79a0704259f418beb528710b39c618a518d868c7754d1"} Feb 18 06:15:57 crc kubenswrapper[4869]: I0218 
06:15:57.321813 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn" Feb 18 06:15:57 crc kubenswrapper[4869]: I0218 06:15:57.513733 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1475668a-1132-4548-a5e6-0f4a459480c1-inventory\") pod \"1475668a-1132-4548-a5e6-0f4a459480c1\" (UID: \"1475668a-1132-4548-a5e6-0f4a459480c1\") " Feb 18 06:15:57 crc kubenswrapper[4869]: I0218 06:15:57.513842 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1475668a-1132-4548-a5e6-0f4a459480c1-ssh-key-openstack-edpm-ipam\") pod \"1475668a-1132-4548-a5e6-0f4a459480c1\" (UID: \"1475668a-1132-4548-a5e6-0f4a459480c1\") " Feb 18 06:15:57 crc kubenswrapper[4869]: I0218 06:15:57.514079 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twphb\" (UniqueName: \"kubernetes.io/projected/1475668a-1132-4548-a5e6-0f4a459480c1-kube-api-access-twphb\") pod \"1475668a-1132-4548-a5e6-0f4a459480c1\" (UID: \"1475668a-1132-4548-a5e6-0f4a459480c1\") " Feb 18 06:15:57 crc kubenswrapper[4869]: I0218 06:15:57.519632 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1475668a-1132-4548-a5e6-0f4a459480c1-kube-api-access-twphb" (OuterVolumeSpecName: "kube-api-access-twphb") pod "1475668a-1132-4548-a5e6-0f4a459480c1" (UID: "1475668a-1132-4548-a5e6-0f4a459480c1"). InnerVolumeSpecName "kube-api-access-twphb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:15:57 crc kubenswrapper[4869]: I0218 06:15:57.538868 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1475668a-1132-4548-a5e6-0f4a459480c1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1475668a-1132-4548-a5e6-0f4a459480c1" (UID: "1475668a-1132-4548-a5e6-0f4a459480c1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:15:57 crc kubenswrapper[4869]: I0218 06:15:57.547719 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1475668a-1132-4548-a5e6-0f4a459480c1-inventory" (OuterVolumeSpecName: "inventory") pod "1475668a-1132-4548-a5e6-0f4a459480c1" (UID: "1475668a-1132-4548-a5e6-0f4a459480c1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:15:57 crc kubenswrapper[4869]: I0218 06:15:57.617392 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1475668a-1132-4548-a5e6-0f4a459480c1-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:15:57 crc kubenswrapper[4869]: I0218 06:15:57.617434 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1475668a-1132-4548-a5e6-0f4a459480c1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:15:57 crc kubenswrapper[4869]: I0218 06:15:57.617469 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twphb\" (UniqueName: \"kubernetes.io/projected/1475668a-1132-4548-a5e6-0f4a459480c1-kube-api-access-twphb\") on node \"crc\" DevicePath \"\"" Feb 18 06:15:57 crc kubenswrapper[4869]: I0218 06:15:57.966073 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn" 
event={"ID":"1475668a-1132-4548-a5e6-0f4a459480c1","Type":"ContainerDied","Data":"3d6602b88971c02d989f730f37fb398e0ef2a98894d48e00166d99db94d7cd1d"} Feb 18 06:15:57 crc kubenswrapper[4869]: I0218 06:15:57.966121 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d6602b88971c02d989f730f37fb398e0ef2a98894d48e00166d99db94d7cd1d" Feb 18 06:15:57 crc kubenswrapper[4869]: I0218 06:15:57.966163 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn" Feb 18 06:15:58 crc kubenswrapper[4869]: I0218 06:15:58.076576 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-52cjm"] Feb 18 06:15:58 crc kubenswrapper[4869]: E0218 06:15:58.077196 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1475668a-1132-4548-a5e6-0f4a459480c1" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 18 06:15:58 crc kubenswrapper[4869]: I0218 06:15:58.077224 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="1475668a-1132-4548-a5e6-0f4a459480c1" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 18 06:15:58 crc kubenswrapper[4869]: E0218 06:15:58.077242 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07cfdf2a-a3c6-4a6c-943b-86814cd86513" containerName="collect-profiles" Feb 18 06:15:58 crc kubenswrapper[4869]: I0218 06:15:58.077253 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="07cfdf2a-a3c6-4a6c-943b-86814cd86513" containerName="collect-profiles" Feb 18 06:15:58 crc kubenswrapper[4869]: I0218 06:15:58.077573 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="07cfdf2a-a3c6-4a6c-943b-86814cd86513" containerName="collect-profiles" Feb 18 06:15:58 crc kubenswrapper[4869]: I0218 06:15:58.077608 4869 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1475668a-1132-4548-a5e6-0f4a459480c1" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 18 06:15:58 crc kubenswrapper[4869]: I0218 06:15:58.078509 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-52cjm" Feb 18 06:15:58 crc kubenswrapper[4869]: I0218 06:15:58.081352 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:15:58 crc kubenswrapper[4869]: I0218 06:15:58.081551 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5vjl5" Feb 18 06:15:58 crc kubenswrapper[4869]: I0218 06:15:58.081718 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:15:58 crc kubenswrapper[4869]: I0218 06:15:58.084959 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:15:58 crc kubenswrapper[4869]: I0218 06:15:58.085677 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-52cjm"] Feb 18 06:15:58 crc kubenswrapper[4869]: I0218 06:15:58.234824 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eeb5893c-dc4f-4cb4-b55b-007c03e03889-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-52cjm\" (UID: \"eeb5893c-dc4f-4cb4-b55b-007c03e03889\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-52cjm" Feb 18 06:15:58 crc kubenswrapper[4869]: I0218 06:15:58.235090 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eeb5893c-dc4f-4cb4-b55b-007c03e03889-inventory\") 
pod \"validate-network-edpm-deployment-openstack-edpm-ipam-52cjm\" (UID: \"eeb5893c-dc4f-4cb4-b55b-007c03e03889\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-52cjm" Feb 18 06:15:58 crc kubenswrapper[4869]: I0218 06:15:58.235406 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhf9k\" (UniqueName: \"kubernetes.io/projected/eeb5893c-dc4f-4cb4-b55b-007c03e03889-kube-api-access-vhf9k\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-52cjm\" (UID: \"eeb5893c-dc4f-4cb4-b55b-007c03e03889\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-52cjm" Feb 18 06:15:58 crc kubenswrapper[4869]: I0218 06:15:58.337538 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eeb5893c-dc4f-4cb4-b55b-007c03e03889-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-52cjm\" (UID: \"eeb5893c-dc4f-4cb4-b55b-007c03e03889\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-52cjm" Feb 18 06:15:58 crc kubenswrapper[4869]: I0218 06:15:58.337641 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhf9k\" (UniqueName: \"kubernetes.io/projected/eeb5893c-dc4f-4cb4-b55b-007c03e03889-kube-api-access-vhf9k\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-52cjm\" (UID: \"eeb5893c-dc4f-4cb4-b55b-007c03e03889\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-52cjm" Feb 18 06:15:58 crc kubenswrapper[4869]: I0218 06:15:58.337720 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eeb5893c-dc4f-4cb4-b55b-007c03e03889-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-52cjm\" (UID: \"eeb5893c-dc4f-4cb4-b55b-007c03e03889\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-52cjm" Feb 18 06:15:58 crc kubenswrapper[4869]: I0218 06:15:58.342717 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eeb5893c-dc4f-4cb4-b55b-007c03e03889-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-52cjm\" (UID: \"eeb5893c-dc4f-4cb4-b55b-007c03e03889\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-52cjm" Feb 18 06:15:58 crc kubenswrapper[4869]: I0218 06:15:58.352359 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eeb5893c-dc4f-4cb4-b55b-007c03e03889-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-52cjm\" (UID: \"eeb5893c-dc4f-4cb4-b55b-007c03e03889\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-52cjm" Feb 18 06:15:58 crc kubenswrapper[4869]: I0218 06:15:58.358067 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhf9k\" (UniqueName: \"kubernetes.io/projected/eeb5893c-dc4f-4cb4-b55b-007c03e03889-kube-api-access-vhf9k\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-52cjm\" (UID: \"eeb5893c-dc4f-4cb4-b55b-007c03e03889\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-52cjm" Feb 18 06:15:58 crc kubenswrapper[4869]: I0218 06:15:58.410458 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-52cjm" Feb 18 06:15:58 crc kubenswrapper[4869]: I0218 06:15:58.959616 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-52cjm"] Feb 18 06:15:59 crc kubenswrapper[4869]: I0218 06:15:59.989888 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-52cjm" event={"ID":"eeb5893c-dc4f-4cb4-b55b-007c03e03889","Type":"ContainerStarted","Data":"7d16eed4875e986f3fc89d7b650707a9b6cf51c9e54afcd432dccac6281880e7"} Feb 18 06:15:59 crc kubenswrapper[4869]: I0218 06:15:59.990678 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-52cjm" event={"ID":"eeb5893c-dc4f-4cb4-b55b-007c03e03889","Type":"ContainerStarted","Data":"750326df581d5939b968e1df32e41d3d018ebb8b8cc4660040746179b9add980"} Feb 18 06:16:00 crc kubenswrapper[4869]: I0218 06:16:00.014126 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-52cjm" podStartSLOduration=1.557021378 podStartE2EDuration="2.014107779s" podCreationTimestamp="2026-02-18 06:15:58 +0000 UTC" firstStartedPulling="2026-02-18 06:15:58.97218705 +0000 UTC m=+1656.141275282" lastFinishedPulling="2026-02-18 06:15:59.429273461 +0000 UTC m=+1656.598361683" observedRunningTime="2026-02-18 06:16:00.009623572 +0000 UTC m=+1657.178711834" watchObservedRunningTime="2026-02-18 06:16:00.014107779 +0000 UTC m=+1657.183196011" Feb 18 06:16:01 crc kubenswrapper[4869]: I0218 06:16:01.042092 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-gk2zt"] Feb 18 06:16:01 crc kubenswrapper[4869]: I0218 06:16:01.052000 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-gk2zt"] Feb 18 06:16:01 crc kubenswrapper[4869]: 
I0218 06:16:01.470156 4869 scope.go:117] "RemoveContainer" containerID="6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7" Feb 18 06:16:01 crc kubenswrapper[4869]: E0218 06:16:01.470853 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:16:01 crc kubenswrapper[4869]: I0218 06:16:01.480494 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50b8c093-2eb4-4220-b335-b5b94fb8776e" path="/var/lib/kubelet/pods/50b8c093-2eb4-4220-b335-b5b94fb8776e/volumes" Feb 18 06:16:03 crc kubenswrapper[4869]: I0218 06:16:03.036918 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-8wq6q"] Feb 18 06:16:03 crc kubenswrapper[4869]: I0218 06:16:03.057723 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-8wq6q"] Feb 18 06:16:03 crc kubenswrapper[4869]: I0218 06:16:03.488495 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c953d279-bde4-4a98-87d5-3cbcaefc875c" path="/var/lib/kubelet/pods/c953d279-bde4-4a98-87d5-3cbcaefc875c/volumes" Feb 18 06:16:05 crc kubenswrapper[4869]: I0218 06:16:05.030351 4869 generic.go:334] "Generic (PLEG): container finished" podID="eeb5893c-dc4f-4cb4-b55b-007c03e03889" containerID="7d16eed4875e986f3fc89d7b650707a9b6cf51c9e54afcd432dccac6281880e7" exitCode=0 Feb 18 06:16:05 crc kubenswrapper[4869]: I0218 06:16:05.030396 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-52cjm" 
event={"ID":"eeb5893c-dc4f-4cb4-b55b-007c03e03889","Type":"ContainerDied","Data":"7d16eed4875e986f3fc89d7b650707a9b6cf51c9e54afcd432dccac6281880e7"} Feb 18 06:16:06 crc kubenswrapper[4869]: I0218 06:16:06.459233 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-52cjm" Feb 18 06:16:06 crc kubenswrapper[4869]: I0218 06:16:06.615193 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhf9k\" (UniqueName: \"kubernetes.io/projected/eeb5893c-dc4f-4cb4-b55b-007c03e03889-kube-api-access-vhf9k\") pod \"eeb5893c-dc4f-4cb4-b55b-007c03e03889\" (UID: \"eeb5893c-dc4f-4cb4-b55b-007c03e03889\") " Feb 18 06:16:06 crc kubenswrapper[4869]: I0218 06:16:06.615435 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eeb5893c-dc4f-4cb4-b55b-007c03e03889-inventory\") pod \"eeb5893c-dc4f-4cb4-b55b-007c03e03889\" (UID: \"eeb5893c-dc4f-4cb4-b55b-007c03e03889\") " Feb 18 06:16:06 crc kubenswrapper[4869]: I0218 06:16:06.615498 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eeb5893c-dc4f-4cb4-b55b-007c03e03889-ssh-key-openstack-edpm-ipam\") pod \"eeb5893c-dc4f-4cb4-b55b-007c03e03889\" (UID: \"eeb5893c-dc4f-4cb4-b55b-007c03e03889\") " Feb 18 06:16:06 crc kubenswrapper[4869]: I0218 06:16:06.630882 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeb5893c-dc4f-4cb4-b55b-007c03e03889-kube-api-access-vhf9k" (OuterVolumeSpecName: "kube-api-access-vhf9k") pod "eeb5893c-dc4f-4cb4-b55b-007c03e03889" (UID: "eeb5893c-dc4f-4cb4-b55b-007c03e03889"). InnerVolumeSpecName "kube-api-access-vhf9k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:16:06 crc kubenswrapper[4869]: I0218 06:16:06.641115 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeb5893c-dc4f-4cb4-b55b-007c03e03889-inventory" (OuterVolumeSpecName: "inventory") pod "eeb5893c-dc4f-4cb4-b55b-007c03e03889" (UID: "eeb5893c-dc4f-4cb4-b55b-007c03e03889"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:16:06 crc kubenswrapper[4869]: I0218 06:16:06.642348 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeb5893c-dc4f-4cb4-b55b-007c03e03889-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "eeb5893c-dc4f-4cb4-b55b-007c03e03889" (UID: "eeb5893c-dc4f-4cb4-b55b-007c03e03889"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:16:06 crc kubenswrapper[4869]: I0218 06:16:06.719162 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eeb5893c-dc4f-4cb4-b55b-007c03e03889-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:16:06 crc kubenswrapper[4869]: I0218 06:16:06.719195 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eeb5893c-dc4f-4cb4-b55b-007c03e03889-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:16:06 crc kubenswrapper[4869]: I0218 06:16:06.719206 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhf9k\" (UniqueName: \"kubernetes.io/projected/eeb5893c-dc4f-4cb4-b55b-007c03e03889-kube-api-access-vhf9k\") on node \"crc\" DevicePath \"\"" Feb 18 06:16:07 crc kubenswrapper[4869]: I0218 06:16:07.050391 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-52cjm" 
event={"ID":"eeb5893c-dc4f-4cb4-b55b-007c03e03889","Type":"ContainerDied","Data":"750326df581d5939b968e1df32e41d3d018ebb8b8cc4660040746179b9add980"} Feb 18 06:16:07 crc kubenswrapper[4869]: I0218 06:16:07.050818 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="750326df581d5939b968e1df32e41d3d018ebb8b8cc4660040746179b9add980" Feb 18 06:16:07 crc kubenswrapper[4869]: I0218 06:16:07.050465 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-52cjm" Feb 18 06:16:07 crc kubenswrapper[4869]: I0218 06:16:07.123891 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-cctjr"] Feb 18 06:16:07 crc kubenswrapper[4869]: E0218 06:16:07.124376 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeb5893c-dc4f-4cb4-b55b-007c03e03889" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 18 06:16:07 crc kubenswrapper[4869]: I0218 06:16:07.124401 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeb5893c-dc4f-4cb4-b55b-007c03e03889" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 18 06:16:07 crc kubenswrapper[4869]: I0218 06:16:07.124633 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeb5893c-dc4f-4cb4-b55b-007c03e03889" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 18 06:16:07 crc kubenswrapper[4869]: I0218 06:16:07.125284 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cctjr" Feb 18 06:16:07 crc kubenswrapper[4869]: I0218 06:16:07.127362 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:16:07 crc kubenswrapper[4869]: I0218 06:16:07.127850 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:16:07 crc kubenswrapper[4869]: I0218 06:16:07.128089 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:16:07 crc kubenswrapper[4869]: I0218 06:16:07.128131 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5vjl5" Feb 18 06:16:07 crc kubenswrapper[4869]: I0218 06:16:07.136265 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-cctjr"] Feb 18 06:16:07 crc kubenswrapper[4869]: I0218 06:16:07.229367 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2k4c\" (UniqueName: \"kubernetes.io/projected/c1b93caa-11f6-4841-b63c-6542711f26cc-kube-api-access-h2k4c\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cctjr\" (UID: \"c1b93caa-11f6-4841-b63c-6542711f26cc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cctjr" Feb 18 06:16:07 crc kubenswrapper[4869]: I0218 06:16:07.229421 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1b93caa-11f6-4841-b63c-6542711f26cc-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cctjr\" (UID: \"c1b93caa-11f6-4841-b63c-6542711f26cc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cctjr" Feb 18 06:16:07 crc kubenswrapper[4869]: I0218 06:16:07.229553 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1b93caa-11f6-4841-b63c-6542711f26cc-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cctjr\" (UID: \"c1b93caa-11f6-4841-b63c-6542711f26cc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cctjr" Feb 18 06:16:07 crc kubenswrapper[4869]: I0218 06:16:07.331579 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2k4c\" (UniqueName: \"kubernetes.io/projected/c1b93caa-11f6-4841-b63c-6542711f26cc-kube-api-access-h2k4c\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cctjr\" (UID: \"c1b93caa-11f6-4841-b63c-6542711f26cc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cctjr" Feb 18 06:16:07 crc kubenswrapper[4869]: I0218 06:16:07.331692 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1b93caa-11f6-4841-b63c-6542711f26cc-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cctjr\" (UID: \"c1b93caa-11f6-4841-b63c-6542711f26cc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cctjr" Feb 18 06:16:07 crc kubenswrapper[4869]: I0218 06:16:07.332026 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1b93caa-11f6-4841-b63c-6542711f26cc-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cctjr\" (UID: \"c1b93caa-11f6-4841-b63c-6542711f26cc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cctjr" Feb 18 06:16:07 crc kubenswrapper[4869]: I0218 06:16:07.337461 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1b93caa-11f6-4841-b63c-6542711f26cc-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-cctjr\" (UID: \"c1b93caa-11f6-4841-b63c-6542711f26cc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cctjr" Feb 18 06:16:07 crc kubenswrapper[4869]: I0218 06:16:07.337565 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1b93caa-11f6-4841-b63c-6542711f26cc-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cctjr\" (UID: \"c1b93caa-11f6-4841-b63c-6542711f26cc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cctjr" Feb 18 06:16:07 crc kubenswrapper[4869]: I0218 06:16:07.351729 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2k4c\" (UniqueName: \"kubernetes.io/projected/c1b93caa-11f6-4841-b63c-6542711f26cc-kube-api-access-h2k4c\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cctjr\" (UID: \"c1b93caa-11f6-4841-b63c-6542711f26cc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cctjr" Feb 18 06:16:07 crc kubenswrapper[4869]: I0218 06:16:07.452196 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cctjr" Feb 18 06:16:07 crc kubenswrapper[4869]: I0218 06:16:07.842146 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-cctjr"] Feb 18 06:16:08 crc kubenswrapper[4869]: I0218 06:16:08.058935 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cctjr" event={"ID":"c1b93caa-11f6-4841-b63c-6542711f26cc","Type":"ContainerStarted","Data":"2d324138985bb30a6e4746dda524b01943e8e18483b1e1b30bf1218c358a818e"} Feb 18 06:16:09 crc kubenswrapper[4869]: I0218 06:16:09.070511 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cctjr" event={"ID":"c1b93caa-11f6-4841-b63c-6542711f26cc","Type":"ContainerStarted","Data":"4d2a459f15670e8b2bd4ed18f49d01b71fbd367dff095727d74b07c4f0bd194b"} Feb 18 06:16:14 crc kubenswrapper[4869]: I0218 06:16:14.470028 4869 scope.go:117] "RemoveContainer" containerID="6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7" Feb 18 06:16:14 crc kubenswrapper[4869]: E0218 06:16:14.470797 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:16:16 crc kubenswrapper[4869]: I0218 06:16:16.063648 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cctjr" podStartSLOduration=8.692066247 podStartE2EDuration="9.063623233s" podCreationTimestamp="2026-02-18 06:16:07 +0000 UTC" firstStartedPulling="2026-02-18 
06:16:07.849689451 +0000 UTC m=+1665.018777683" lastFinishedPulling="2026-02-18 06:16:08.221246417 +0000 UTC m=+1665.390334669" observedRunningTime="2026-02-18 06:16:09.090054126 +0000 UTC m=+1666.259142368" watchObservedRunningTime="2026-02-18 06:16:16.063623233 +0000 UTC m=+1673.232711465" Feb 18 06:16:16 crc kubenswrapper[4869]: I0218 06:16:16.065649 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-zzcxf"] Feb 18 06:16:16 crc kubenswrapper[4869]: I0218 06:16:16.073178 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-zzcxf"] Feb 18 06:16:17 crc kubenswrapper[4869]: I0218 06:16:17.483161 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffdffd9a-f626-4bf2-b1e0-104eca55e7f5" path="/var/lib/kubelet/pods/ffdffd9a-f626-4bf2-b1e0-104eca55e7f5/volumes" Feb 18 06:16:22 crc kubenswrapper[4869]: I0218 06:16:22.029132 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-8v5fn"] Feb 18 06:16:22 crc kubenswrapper[4869]: I0218 06:16:22.037150 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-8v5fn"] Feb 18 06:16:23 crc kubenswrapper[4869]: I0218 06:16:23.484880 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77d2d3cf-1108-468b-816a-64d29471542e" path="/var/lib/kubelet/pods/77d2d3cf-1108-468b-816a-64d29471542e/volumes" Feb 18 06:16:26 crc kubenswrapper[4869]: I0218 06:16:26.471137 4869 scope.go:117] "RemoveContainer" containerID="6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7" Feb 18 06:16:26 crc kubenswrapper[4869]: E0218 06:16:26.471996 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:16:29 crc kubenswrapper[4869]: I0218 06:16:29.276508 4869 scope.go:117] "RemoveContainer" containerID="5710b416d77ba4c1b05282bf3850529b3d884abdc7dc8b9bfb141a5daf9abf53" Feb 18 06:16:29 crc kubenswrapper[4869]: I0218 06:16:29.324086 4869 scope.go:117] "RemoveContainer" containerID="e7fe756e70be2b967d5c71dcb58200ea0cc65a7f24d645a4c839f706c050a2df" Feb 18 06:16:29 crc kubenswrapper[4869]: I0218 06:16:29.384711 4869 scope.go:117] "RemoveContainer" containerID="daff6c6b03f528b83967bcbb8355a5fc18ffb7d072cccff4c7d7d4e3ba44ca7d" Feb 18 06:16:29 crc kubenswrapper[4869]: I0218 06:16:29.444947 4869 scope.go:117] "RemoveContainer" containerID="accb33c9fcc62c78ffa3fb316288bd7d9ed68be05f9e7fc595f6cbbea44f8ced" Feb 18 06:16:29 crc kubenswrapper[4869]: I0218 06:16:29.475963 4869 scope.go:117] "RemoveContainer" containerID="56db69629c5551f72754a181b5e9e12c994e64a0bd704b08a7755e193c82ade4" Feb 18 06:16:41 crc kubenswrapper[4869]: I0218 06:16:41.470269 4869 scope.go:117] "RemoveContainer" containerID="6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7" Feb 18 06:16:41 crc kubenswrapper[4869]: E0218 06:16:41.471148 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:16:45 crc kubenswrapper[4869]: I0218 06:16:45.546140 4869 generic.go:334] "Generic (PLEG): container finished" podID="c1b93caa-11f6-4841-b63c-6542711f26cc" containerID="4d2a459f15670e8b2bd4ed18f49d01b71fbd367dff095727d74b07c4f0bd194b" exitCode=0 Feb 18 06:16:45 crc kubenswrapper[4869]: I0218 06:16:45.546225 
4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cctjr" event={"ID":"c1b93caa-11f6-4841-b63c-6542711f26cc","Type":"ContainerDied","Data":"4d2a459f15670e8b2bd4ed18f49d01b71fbd367dff095727d74b07c4f0bd194b"} Feb 18 06:16:46 crc kubenswrapper[4869]: I0218 06:16:46.991054 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cctjr" Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.175720 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1b93caa-11f6-4841-b63c-6542711f26cc-inventory\") pod \"c1b93caa-11f6-4841-b63c-6542711f26cc\" (UID: \"c1b93caa-11f6-4841-b63c-6542711f26cc\") " Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.176191 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2k4c\" (UniqueName: \"kubernetes.io/projected/c1b93caa-11f6-4841-b63c-6542711f26cc-kube-api-access-h2k4c\") pod \"c1b93caa-11f6-4841-b63c-6542711f26cc\" (UID: \"c1b93caa-11f6-4841-b63c-6542711f26cc\") " Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.176305 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1b93caa-11f6-4841-b63c-6542711f26cc-ssh-key-openstack-edpm-ipam\") pod \"c1b93caa-11f6-4841-b63c-6542711f26cc\" (UID: \"c1b93caa-11f6-4841-b63c-6542711f26cc\") " Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.181671 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1b93caa-11f6-4841-b63c-6542711f26cc-kube-api-access-h2k4c" (OuterVolumeSpecName: "kube-api-access-h2k4c") pod "c1b93caa-11f6-4841-b63c-6542711f26cc" (UID: "c1b93caa-11f6-4841-b63c-6542711f26cc"). InnerVolumeSpecName "kube-api-access-h2k4c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.202608 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1b93caa-11f6-4841-b63c-6542711f26cc-inventory" (OuterVolumeSpecName: "inventory") pod "c1b93caa-11f6-4841-b63c-6542711f26cc" (UID: "c1b93caa-11f6-4841-b63c-6542711f26cc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.203056 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1b93caa-11f6-4841-b63c-6542711f26cc-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c1b93caa-11f6-4841-b63c-6542711f26cc" (UID: "c1b93caa-11f6-4841-b63c-6542711f26cc"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.279094 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2k4c\" (UniqueName: \"kubernetes.io/projected/c1b93caa-11f6-4841-b63c-6542711f26cc-kube-api-access-h2k4c\") on node \"crc\" DevicePath \"\"" Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.279137 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1b93caa-11f6-4841-b63c-6542711f26cc-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.279149 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1b93caa-11f6-4841-b63c-6542711f26cc-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.561135 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cctjr" 
event={"ID":"c1b93caa-11f6-4841-b63c-6542711f26cc","Type":"ContainerDied","Data":"2d324138985bb30a6e4746dda524b01943e8e18483b1e1b30bf1218c358a818e"} Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.561205 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d324138985bb30a6e4746dda524b01943e8e18483b1e1b30bf1218c358a818e" Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.561177 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cctjr" Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.735269 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f"] Feb 18 06:16:47 crc kubenswrapper[4869]: E0218 06:16:47.735905 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1b93caa-11f6-4841-b63c-6542711f26cc" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.735994 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1b93caa-11f6-4841-b63c-6542711f26cc" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.736230 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1b93caa-11f6-4841-b63c-6542711f26cc" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.736874 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f" Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.740676 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.740980 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5vjl5" Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.741250 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.741302 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.759246 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f"] Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.890049 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjn6c\" (UniqueName: \"kubernetes.io/projected/8ab4b789-eeaf-4e68-b947-436fc6f6bafa-kube-api-access-pjn6c\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f\" (UID: \"8ab4b789-eeaf-4e68-b947-436fc6f6bafa\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f" Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.890147 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8ab4b789-eeaf-4e68-b947-436fc6f6bafa-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f\" (UID: \"8ab4b789-eeaf-4e68-b947-436fc6f6bafa\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f" Feb 18 06:16:47 crc 
kubenswrapper[4869]: I0218 06:16:47.890221 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ab4b789-eeaf-4e68-b947-436fc6f6bafa-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f\" (UID: \"8ab4b789-eeaf-4e68-b947-436fc6f6bafa\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f" Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.992069 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjn6c\" (UniqueName: \"kubernetes.io/projected/8ab4b789-eeaf-4e68-b947-436fc6f6bafa-kube-api-access-pjn6c\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f\" (UID: \"8ab4b789-eeaf-4e68-b947-436fc6f6bafa\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f" Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.992139 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8ab4b789-eeaf-4e68-b947-436fc6f6bafa-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f\" (UID: \"8ab4b789-eeaf-4e68-b947-436fc6f6bafa\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f" Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.992229 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ab4b789-eeaf-4e68-b947-436fc6f6bafa-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f\" (UID: \"8ab4b789-eeaf-4e68-b947-436fc6f6bafa\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f" Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.996858 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/8ab4b789-eeaf-4e68-b947-436fc6f6bafa-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f\" (UID: \"8ab4b789-eeaf-4e68-b947-436fc6f6bafa\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f" Feb 18 06:16:47 crc kubenswrapper[4869]: I0218 06:16:47.997489 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8ab4b789-eeaf-4e68-b947-436fc6f6bafa-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f\" (UID: \"8ab4b789-eeaf-4e68-b947-436fc6f6bafa\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f" Feb 18 06:16:48 crc kubenswrapper[4869]: I0218 06:16:48.010119 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjn6c\" (UniqueName: \"kubernetes.io/projected/8ab4b789-eeaf-4e68-b947-436fc6f6bafa-kube-api-access-pjn6c\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f\" (UID: \"8ab4b789-eeaf-4e68-b947-436fc6f6bafa\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f" Feb 18 06:16:48 crc kubenswrapper[4869]: I0218 06:16:48.052954 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f" Feb 18 06:16:48 crc kubenswrapper[4869]: I0218 06:16:48.601698 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f"] Feb 18 06:16:48 crc kubenswrapper[4869]: W0218 06:16:48.608073 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ab4b789_eeaf_4e68_b947_436fc6f6bafa.slice/crio-ca2c024e728149a5862f9d99032cf80836b15c948fef9b584a2a0fd98270d94e WatchSource:0}: Error finding container ca2c024e728149a5862f9d99032cf80836b15c948fef9b584a2a0fd98270d94e: Status 404 returned error can't find the container with id ca2c024e728149a5862f9d99032cf80836b15c948fef9b584a2a0fd98270d94e Feb 18 06:16:49 crc kubenswrapper[4869]: I0218 06:16:49.593604 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f" event={"ID":"8ab4b789-eeaf-4e68-b947-436fc6f6bafa","Type":"ContainerStarted","Data":"5d41c9fca6041a1210b1b1f5a887a02f8762b2581a5c758fdc936ba431d16820"} Feb 18 06:16:49 crc kubenswrapper[4869]: I0218 06:16:49.593976 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f" event={"ID":"8ab4b789-eeaf-4e68-b947-436fc6f6bafa","Type":"ContainerStarted","Data":"ca2c024e728149a5862f9d99032cf80836b15c948fef9b584a2a0fd98270d94e"} Feb 18 06:16:49 crc kubenswrapper[4869]: I0218 06:16:49.619544 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f" podStartSLOduration=2.165540177 podStartE2EDuration="2.619524113s" podCreationTimestamp="2026-02-18 06:16:47 +0000 UTC" firstStartedPulling="2026-02-18 06:16:48.611179807 +0000 UTC m=+1705.780268039" lastFinishedPulling="2026-02-18 06:16:49.065163743 +0000 UTC m=+1706.234251975" 
observedRunningTime="2026-02-18 06:16:49.612466137 +0000 UTC m=+1706.781554359" watchObservedRunningTime="2026-02-18 06:16:49.619524113 +0000 UTC m=+1706.788612345" Feb 18 06:16:54 crc kubenswrapper[4869]: I0218 06:16:54.470535 4869 scope.go:117] "RemoveContainer" containerID="6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7" Feb 18 06:16:54 crc kubenswrapper[4869]: E0218 06:16:54.471611 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:16:56 crc kubenswrapper[4869]: I0218 06:16:56.182161 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-4965-account-create-update-dgkm7"] Feb 18 06:16:56 crc kubenswrapper[4869]: I0218 06:16:56.263633 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-4965-account-create-update-dgkm7"] Feb 18 06:16:57 crc kubenswrapper[4869]: I0218 06:16:57.039766 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-7qzlb"] Feb 18 06:16:57 crc kubenswrapper[4869]: I0218 06:16:57.048275 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-c893-account-create-update-c8z8g"] Feb 18 06:16:57 crc kubenswrapper[4869]: I0218 06:16:57.080922 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-1cde-account-create-update-rzrgg"] Feb 18 06:16:57 crc kubenswrapper[4869]: I0218 06:16:57.090854 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-gc7td"] Feb 18 06:16:57 crc kubenswrapper[4869]: I0218 06:16:57.101389 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-db-create-qssnr"] Feb 18 06:16:57 crc kubenswrapper[4869]: I0218 06:16:57.110493 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-c893-account-create-update-c8z8g"] Feb 18 06:16:57 crc kubenswrapper[4869]: I0218 06:16:57.119097 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-1cde-account-create-update-rzrgg"] Feb 18 06:16:57 crc kubenswrapper[4869]: I0218 06:16:57.128942 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-qssnr"] Feb 18 06:16:57 crc kubenswrapper[4869]: I0218 06:16:57.139512 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-gc7td"] Feb 18 06:16:57 crc kubenswrapper[4869]: I0218 06:16:57.147167 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-7qzlb"] Feb 18 06:16:57 crc kubenswrapper[4869]: I0218 06:16:57.480324 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23568676-efc6-4e84-b939-94d530a055c0" path="/var/lib/kubelet/pods/23568676-efc6-4e84-b939-94d530a055c0/volumes" Feb 18 06:16:57 crc kubenswrapper[4869]: I0218 06:16:57.480979 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6" path="/var/lib/kubelet/pods/31bc91ce-a4ea-4a8a-9318-0cb66a7e8fd6/volumes" Feb 18 06:16:57 crc kubenswrapper[4869]: I0218 06:16:57.481514 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="436a64f9-ee1e-41cb-9db4-b918bc5c9d71" path="/var/lib/kubelet/pods/436a64f9-ee1e-41cb-9db4-b918bc5c9d71/volumes" Feb 18 06:16:57 crc kubenswrapper[4869]: I0218 06:16:57.482147 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57fede98-3d1b-4596-baec-4d975793c9ea" path="/var/lib/kubelet/pods/57fede98-3d1b-4596-baec-4d975793c9ea/volumes" Feb 18 06:16:57 crc kubenswrapper[4869]: I0218 06:16:57.483166 4869 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="9f987f38-7e61-4316-8935-02a029937c98" path="/var/lib/kubelet/pods/9f987f38-7e61-4316-8935-02a029937c98/volumes" Feb 18 06:16:57 crc kubenswrapper[4869]: I0218 06:16:57.483712 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffaffd72-bfdb-4695-a882-14c5eb87ed33" path="/var/lib/kubelet/pods/ffaffd72-bfdb-4695-a882-14c5eb87ed33/volumes" Feb 18 06:17:09 crc kubenswrapper[4869]: I0218 06:17:09.474780 4869 scope.go:117] "RemoveContainer" containerID="6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7" Feb 18 06:17:09 crc kubenswrapper[4869]: E0218 06:17:09.477481 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:17:23 crc kubenswrapper[4869]: I0218 06:17:23.479111 4869 scope.go:117] "RemoveContainer" containerID="6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7" Feb 18 06:17:23 crc kubenswrapper[4869]: E0218 06:17:23.480135 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:17:24 crc kubenswrapper[4869]: I0218 06:17:24.035245 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-77gnw"] Feb 18 06:17:24 crc kubenswrapper[4869]: I0218 06:17:24.045661 4869 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-77gnw"] Feb 18 06:17:25 crc kubenswrapper[4869]: I0218 06:17:25.481436 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4c122b7-735c-4d51-96f4-7e7db134dece" path="/var/lib/kubelet/pods/e4c122b7-735c-4d51-96f4-7e7db134dece/volumes" Feb 18 06:17:29 crc kubenswrapper[4869]: I0218 06:17:29.630858 4869 scope.go:117] "RemoveContainer" containerID="1905007e2f0201bdb04c016b478b487ac60263424eaa13f6c572edffb2a3993c" Feb 18 06:17:29 crc kubenswrapper[4869]: I0218 06:17:29.655447 4869 scope.go:117] "RemoveContainer" containerID="c9bd66304325dffb0e3c495168d3a63ceb35429a22f3f0d86ef6228204998c99" Feb 18 06:17:29 crc kubenswrapper[4869]: I0218 06:17:29.743653 4869 scope.go:117] "RemoveContainer" containerID="0a12f12b5ca041715d9cab710c1cac7c21e1226fa2c8c37fb622ba9b239851ca" Feb 18 06:17:29 crc kubenswrapper[4869]: I0218 06:17:29.779799 4869 scope.go:117] "RemoveContainer" containerID="08d6666d7a42fc771192d0995f5f53cd020e18271debc9ec42ef79d28f489a5f" Feb 18 06:17:29 crc kubenswrapper[4869]: I0218 06:17:29.828351 4869 scope.go:117] "RemoveContainer" containerID="12ee1f62901035ef5681355d4239845d7a5bf88512761454c9b3a4b1304e0614" Feb 18 06:17:29 crc kubenswrapper[4869]: I0218 06:17:29.904270 4869 scope.go:117] "RemoveContainer" containerID="3ae2a6be93a6c0257b9742278d7f4f254a02195f06fcb53dfe1270d0d250f04e" Feb 18 06:17:29 crc kubenswrapper[4869]: I0218 06:17:29.921283 4869 scope.go:117] "RemoveContainer" containerID="1acbf0f1a84276f5446002adcd8fb78b38cd7b713982ff7f74b233637fc33927" Feb 18 06:17:38 crc kubenswrapper[4869]: I0218 06:17:38.472841 4869 scope.go:117] "RemoveContainer" containerID="6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7" Feb 18 06:17:38 crc kubenswrapper[4869]: E0218 06:17:38.474564 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:17:41 crc kubenswrapper[4869]: I0218 06:17:41.124132 4869 generic.go:334] "Generic (PLEG): container finished" podID="8ab4b789-eeaf-4e68-b947-436fc6f6bafa" containerID="5d41c9fca6041a1210b1b1f5a887a02f8762b2581a5c758fdc936ba431d16820" exitCode=0 Feb 18 06:17:41 crc kubenswrapper[4869]: I0218 06:17:41.124214 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f" event={"ID":"8ab4b789-eeaf-4e68-b947-436fc6f6bafa","Type":"ContainerDied","Data":"5d41c9fca6041a1210b1b1f5a887a02f8762b2581a5c758fdc936ba431d16820"} Feb 18 06:17:42 crc kubenswrapper[4869]: I0218 06:17:42.575389 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f" Feb 18 06:17:42 crc kubenswrapper[4869]: I0218 06:17:42.764859 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ab4b789-eeaf-4e68-b947-436fc6f6bafa-inventory\") pod \"8ab4b789-eeaf-4e68-b947-436fc6f6bafa\" (UID: \"8ab4b789-eeaf-4e68-b947-436fc6f6bafa\") " Feb 18 06:17:42 crc kubenswrapper[4869]: I0218 06:17:42.764954 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjn6c\" (UniqueName: \"kubernetes.io/projected/8ab4b789-eeaf-4e68-b947-436fc6f6bafa-kube-api-access-pjn6c\") pod \"8ab4b789-eeaf-4e68-b947-436fc6f6bafa\" (UID: \"8ab4b789-eeaf-4e68-b947-436fc6f6bafa\") " Feb 18 06:17:42 crc kubenswrapper[4869]: I0218 06:17:42.765091 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/8ab4b789-eeaf-4e68-b947-436fc6f6bafa-ssh-key-openstack-edpm-ipam\") pod \"8ab4b789-eeaf-4e68-b947-436fc6f6bafa\" (UID: \"8ab4b789-eeaf-4e68-b947-436fc6f6bafa\") " Feb 18 06:17:42 crc kubenswrapper[4869]: I0218 06:17:42.772266 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ab4b789-eeaf-4e68-b947-436fc6f6bafa-kube-api-access-pjn6c" (OuterVolumeSpecName: "kube-api-access-pjn6c") pod "8ab4b789-eeaf-4e68-b947-436fc6f6bafa" (UID: "8ab4b789-eeaf-4e68-b947-436fc6f6bafa"). InnerVolumeSpecName "kube-api-access-pjn6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:17:42 crc kubenswrapper[4869]: I0218 06:17:42.794886 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ab4b789-eeaf-4e68-b947-436fc6f6bafa-inventory" (OuterVolumeSpecName: "inventory") pod "8ab4b789-eeaf-4e68-b947-436fc6f6bafa" (UID: "8ab4b789-eeaf-4e68-b947-436fc6f6bafa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:17:42 crc kubenswrapper[4869]: I0218 06:17:42.800903 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ab4b789-eeaf-4e68-b947-436fc6f6bafa-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8ab4b789-eeaf-4e68-b947-436fc6f6bafa" (UID: "8ab4b789-eeaf-4e68-b947-436fc6f6bafa"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:17:42 crc kubenswrapper[4869]: I0218 06:17:42.866842 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ab4b789-eeaf-4e68-b947-436fc6f6bafa-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:17:42 crc kubenswrapper[4869]: I0218 06:17:42.866875 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjn6c\" (UniqueName: \"kubernetes.io/projected/8ab4b789-eeaf-4e68-b947-436fc6f6bafa-kube-api-access-pjn6c\") on node \"crc\" DevicePath \"\"" Feb 18 06:17:42 crc kubenswrapper[4869]: I0218 06:17:42.866886 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8ab4b789-eeaf-4e68-b947-436fc6f6bafa-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 06:17:43.042840 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-74k4h"] Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 06:17:43.050556 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wbsdh"] Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 06:17:43.058940 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-74k4h"] Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 06:17:43.068053 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wbsdh"] Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 06:17:43.140720 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f" event={"ID":"8ab4b789-eeaf-4e68-b947-436fc6f6bafa","Type":"ContainerDied","Data":"ca2c024e728149a5862f9d99032cf80836b15c948fef9b584a2a0fd98270d94e"} Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 06:17:43.140787 4869 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="ca2c024e728149a5862f9d99032cf80836b15c948fef9b584a2a0fd98270d94e" Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 06:17:43.140844 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f" Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 06:17:43.230658 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-5rbl5"] Feb 18 06:17:43 crc kubenswrapper[4869]: E0218 06:17:43.231202 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ab4b789-eeaf-4e68-b947-436fc6f6bafa" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 06:17:43.231222 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab4b789-eeaf-4e68-b947-436fc6f6bafa" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 06:17:43.231490 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ab4b789-eeaf-4e68-b947-436fc6f6bafa" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 06:17:43.232329 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5rbl5" Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 06:17:43.236348 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 06:17:43.236938 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 06:17:43.241477 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-5rbl5"] Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 06:17:43.241680 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 06:17:43.242082 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5vjl5" Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 06:17:43.274181 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/76fa3bfe-8da3-4ceb-95c6-de5473957a3e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-5rbl5\" (UID: \"76fa3bfe-8da3-4ceb-95c6-de5473957a3e\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rbl5" Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 06:17:43.274241 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/76fa3bfe-8da3-4ceb-95c6-de5473957a3e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-5rbl5\" (UID: \"76fa3bfe-8da3-4ceb-95c6-de5473957a3e\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rbl5" Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 06:17:43.274406 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hgbnb\" (UniqueName: \"kubernetes.io/projected/76fa3bfe-8da3-4ceb-95c6-de5473957a3e-kube-api-access-hgbnb\") pod \"ssh-known-hosts-edpm-deployment-5rbl5\" (UID: \"76fa3bfe-8da3-4ceb-95c6-de5473957a3e\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rbl5" Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 06:17:43.376568 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/76fa3bfe-8da3-4ceb-95c6-de5473957a3e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-5rbl5\" (UID: \"76fa3bfe-8da3-4ceb-95c6-de5473957a3e\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rbl5" Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 06:17:43.377196 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/76fa3bfe-8da3-4ceb-95c6-de5473957a3e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-5rbl5\" (UID: \"76fa3bfe-8da3-4ceb-95c6-de5473957a3e\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rbl5" Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 06:17:43.377648 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgbnb\" (UniqueName: \"kubernetes.io/projected/76fa3bfe-8da3-4ceb-95c6-de5473957a3e-kube-api-access-hgbnb\") pod \"ssh-known-hosts-edpm-deployment-5rbl5\" (UID: \"76fa3bfe-8da3-4ceb-95c6-de5473957a3e\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rbl5" Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 06:17:43.380190 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/76fa3bfe-8da3-4ceb-95c6-de5473957a3e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-5rbl5\" (UID: \"76fa3bfe-8da3-4ceb-95c6-de5473957a3e\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rbl5" Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 
06:17:43.380228 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/76fa3bfe-8da3-4ceb-95c6-de5473957a3e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-5rbl5\" (UID: \"76fa3bfe-8da3-4ceb-95c6-de5473957a3e\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rbl5" Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 06:17:43.396335 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgbnb\" (UniqueName: \"kubernetes.io/projected/76fa3bfe-8da3-4ceb-95c6-de5473957a3e-kube-api-access-hgbnb\") pod \"ssh-known-hosts-edpm-deployment-5rbl5\" (UID: \"76fa3bfe-8da3-4ceb-95c6-de5473957a3e\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rbl5" Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 06:17:43.484881 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dc931b7-a618-4dc7-b89e-7c516d699154" path="/var/lib/kubelet/pods/8dc931b7-a618-4dc7-b89e-7c516d699154/volumes" Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 06:17:43.485538 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaf6dc38-fdba-4044-8f8b-b47d4d03db0a" path="/var/lib/kubelet/pods/aaf6dc38-fdba-4044-8f8b-b47d4d03db0a/volumes" Feb 18 06:17:43 crc kubenswrapper[4869]: I0218 06:17:43.552385 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5rbl5" Feb 18 06:17:44 crc kubenswrapper[4869]: I0218 06:17:44.060190 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-5rbl5"] Feb 18 06:17:44 crc kubenswrapper[4869]: I0218 06:17:44.151091 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5rbl5" event={"ID":"76fa3bfe-8da3-4ceb-95c6-de5473957a3e","Type":"ContainerStarted","Data":"9568b113c56ee3de71a9c81bc381b280a7fbe534885b93e3357f75e8e0b32d3a"} Feb 18 06:17:45 crc kubenswrapper[4869]: I0218 06:17:45.162074 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5rbl5" event={"ID":"76fa3bfe-8da3-4ceb-95c6-de5473957a3e","Type":"ContainerStarted","Data":"24ac1903ad757234500afdec9fa7c029f8157a533b7ff25f3dd64a3a3ce83ba7"} Feb 18 06:17:45 crc kubenswrapper[4869]: I0218 06:17:45.181505 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-5rbl5" podStartSLOduration=1.7824100010000001 podStartE2EDuration="2.181486385s" podCreationTimestamp="2026-02-18 06:17:43 +0000 UTC" firstStartedPulling="2026-02-18 06:17:44.068473044 +0000 UTC m=+1761.237561276" lastFinishedPulling="2026-02-18 06:17:44.467549428 +0000 UTC m=+1761.636637660" observedRunningTime="2026-02-18 06:17:45.175187696 +0000 UTC m=+1762.344275928" watchObservedRunningTime="2026-02-18 06:17:45.181486385 +0000 UTC m=+1762.350574617" Feb 18 06:17:49 crc kubenswrapper[4869]: I0218 06:17:49.470480 4869 scope.go:117] "RemoveContainer" containerID="6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7" Feb 18 06:17:49 crc kubenswrapper[4869]: E0218 06:17:49.471263 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:17:52 crc kubenswrapper[4869]: I0218 06:17:52.218315 4869 generic.go:334] "Generic (PLEG): container finished" podID="76fa3bfe-8da3-4ceb-95c6-de5473957a3e" containerID="24ac1903ad757234500afdec9fa7c029f8157a533b7ff25f3dd64a3a3ce83ba7" exitCode=0 Feb 18 06:17:52 crc kubenswrapper[4869]: I0218 06:17:52.218397 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5rbl5" event={"ID":"76fa3bfe-8da3-4ceb-95c6-de5473957a3e","Type":"ContainerDied","Data":"24ac1903ad757234500afdec9fa7c029f8157a533b7ff25f3dd64a3a3ce83ba7"} Feb 18 06:17:53 crc kubenswrapper[4869]: I0218 06:17:53.661065 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5rbl5" Feb 18 06:17:53 crc kubenswrapper[4869]: I0218 06:17:53.777869 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/76fa3bfe-8da3-4ceb-95c6-de5473957a3e-inventory-0\") pod \"76fa3bfe-8da3-4ceb-95c6-de5473957a3e\" (UID: \"76fa3bfe-8da3-4ceb-95c6-de5473957a3e\") " Feb 18 06:17:53 crc kubenswrapper[4869]: I0218 06:17:53.778015 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgbnb\" (UniqueName: \"kubernetes.io/projected/76fa3bfe-8da3-4ceb-95c6-de5473957a3e-kube-api-access-hgbnb\") pod \"76fa3bfe-8da3-4ceb-95c6-de5473957a3e\" (UID: \"76fa3bfe-8da3-4ceb-95c6-de5473957a3e\") " Feb 18 06:17:53 crc kubenswrapper[4869]: I0218 06:17:53.778266 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/76fa3bfe-8da3-4ceb-95c6-de5473957a3e-ssh-key-openstack-edpm-ipam\") pod 
\"76fa3bfe-8da3-4ceb-95c6-de5473957a3e\" (UID: \"76fa3bfe-8da3-4ceb-95c6-de5473957a3e\") " Feb 18 06:17:53 crc kubenswrapper[4869]: I0218 06:17:53.785738 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76fa3bfe-8da3-4ceb-95c6-de5473957a3e-kube-api-access-hgbnb" (OuterVolumeSpecName: "kube-api-access-hgbnb") pod "76fa3bfe-8da3-4ceb-95c6-de5473957a3e" (UID: "76fa3bfe-8da3-4ceb-95c6-de5473957a3e"). InnerVolumeSpecName "kube-api-access-hgbnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:17:53 crc kubenswrapper[4869]: I0218 06:17:53.804996 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76fa3bfe-8da3-4ceb-95c6-de5473957a3e-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "76fa3bfe-8da3-4ceb-95c6-de5473957a3e" (UID: "76fa3bfe-8da3-4ceb-95c6-de5473957a3e"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:17:53 crc kubenswrapper[4869]: I0218 06:17:53.805398 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76fa3bfe-8da3-4ceb-95c6-de5473957a3e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "76fa3bfe-8da3-4ceb-95c6-de5473957a3e" (UID: "76fa3bfe-8da3-4ceb-95c6-de5473957a3e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:17:53 crc kubenswrapper[4869]: I0218 06:17:53.880147 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/76fa3bfe-8da3-4ceb-95c6-de5473957a3e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:17:53 crc kubenswrapper[4869]: I0218 06:17:53.880178 4869 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/76fa3bfe-8da3-4ceb-95c6-de5473957a3e-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:17:53 crc kubenswrapper[4869]: I0218 06:17:53.880187 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgbnb\" (UniqueName: \"kubernetes.io/projected/76fa3bfe-8da3-4ceb-95c6-de5473957a3e-kube-api-access-hgbnb\") on node \"crc\" DevicePath \"\"" Feb 18 06:17:54 crc kubenswrapper[4869]: I0218 06:17:54.237762 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5rbl5" event={"ID":"76fa3bfe-8da3-4ceb-95c6-de5473957a3e","Type":"ContainerDied","Data":"9568b113c56ee3de71a9c81bc381b280a7fbe534885b93e3357f75e8e0b32d3a"} Feb 18 06:17:54 crc kubenswrapper[4869]: I0218 06:17:54.237813 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9568b113c56ee3de71a9c81bc381b280a7fbe534885b93e3357f75e8e0b32d3a" Feb 18 06:17:54 crc kubenswrapper[4869]: I0218 06:17:54.237808 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5rbl5" Feb 18 06:17:54 crc kubenswrapper[4869]: I0218 06:17:54.327543 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-d4jbn"] Feb 18 06:17:54 crc kubenswrapper[4869]: E0218 06:17:54.328117 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76fa3bfe-8da3-4ceb-95c6-de5473957a3e" containerName="ssh-known-hosts-edpm-deployment" Feb 18 06:17:54 crc kubenswrapper[4869]: I0218 06:17:54.328213 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="76fa3bfe-8da3-4ceb-95c6-de5473957a3e" containerName="ssh-known-hosts-edpm-deployment" Feb 18 06:17:54 crc kubenswrapper[4869]: I0218 06:17:54.328431 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="76fa3bfe-8da3-4ceb-95c6-de5473957a3e" containerName="ssh-known-hosts-edpm-deployment" Feb 18 06:17:54 crc kubenswrapper[4869]: I0218 06:17:54.329101 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d4jbn" Feb 18 06:17:54 crc kubenswrapper[4869]: I0218 06:17:54.333406 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5vjl5" Feb 18 06:17:54 crc kubenswrapper[4869]: I0218 06:17:54.333679 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:17:54 crc kubenswrapper[4869]: I0218 06:17:54.334407 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:17:54 crc kubenswrapper[4869]: I0218 06:17:54.334602 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:17:54 crc kubenswrapper[4869]: I0218 06:17:54.343213 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-d4jbn"] Feb 18 06:17:54 crc kubenswrapper[4869]: I0218 06:17:54.490659 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d4b7d5ea-dca6-4f74-8143-17a7573402d3-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-d4jbn\" (UID: \"d4b7d5ea-dca6-4f74-8143-17a7573402d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d4jbn" Feb 18 06:17:54 crc kubenswrapper[4869]: I0218 06:17:54.490726 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4b7d5ea-dca6-4f74-8143-17a7573402d3-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-d4jbn\" (UID: \"d4b7d5ea-dca6-4f74-8143-17a7573402d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d4jbn" Feb 18 06:17:54 crc kubenswrapper[4869]: I0218 06:17:54.490784 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqjhg\" (UniqueName: \"kubernetes.io/projected/d4b7d5ea-dca6-4f74-8143-17a7573402d3-kube-api-access-zqjhg\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-d4jbn\" (UID: \"d4b7d5ea-dca6-4f74-8143-17a7573402d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d4jbn" Feb 18 06:17:54 crc kubenswrapper[4869]: I0218 06:17:54.592946 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqjhg\" (UniqueName: \"kubernetes.io/projected/d4b7d5ea-dca6-4f74-8143-17a7573402d3-kube-api-access-zqjhg\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-d4jbn\" (UID: \"d4b7d5ea-dca6-4f74-8143-17a7573402d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d4jbn" Feb 18 06:17:54 crc kubenswrapper[4869]: I0218 06:17:54.593334 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d4b7d5ea-dca6-4f74-8143-17a7573402d3-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-d4jbn\" (UID: \"d4b7d5ea-dca6-4f74-8143-17a7573402d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d4jbn" Feb 18 06:17:54 crc kubenswrapper[4869]: I0218 06:17:54.593401 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4b7d5ea-dca6-4f74-8143-17a7573402d3-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-d4jbn\" (UID: \"d4b7d5ea-dca6-4f74-8143-17a7573402d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d4jbn" Feb 18 06:17:54 crc kubenswrapper[4869]: I0218 06:17:54.598086 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4b7d5ea-dca6-4f74-8143-17a7573402d3-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-d4jbn\" (UID: 
\"d4b7d5ea-dca6-4f74-8143-17a7573402d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d4jbn" Feb 18 06:17:54 crc kubenswrapper[4869]: I0218 06:17:54.603944 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d4b7d5ea-dca6-4f74-8143-17a7573402d3-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-d4jbn\" (UID: \"d4b7d5ea-dca6-4f74-8143-17a7573402d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d4jbn" Feb 18 06:17:54 crc kubenswrapper[4869]: I0218 06:17:54.615353 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqjhg\" (UniqueName: \"kubernetes.io/projected/d4b7d5ea-dca6-4f74-8143-17a7573402d3-kube-api-access-zqjhg\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-d4jbn\" (UID: \"d4b7d5ea-dca6-4f74-8143-17a7573402d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d4jbn" Feb 18 06:17:54 crc kubenswrapper[4869]: I0218 06:17:54.648908 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d4jbn" Feb 18 06:17:55 crc kubenswrapper[4869]: I0218 06:17:55.235079 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-d4jbn"] Feb 18 06:17:56 crc kubenswrapper[4869]: I0218 06:17:56.256652 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d4jbn" event={"ID":"d4b7d5ea-dca6-4f74-8143-17a7573402d3","Type":"ContainerStarted","Data":"1967eb29c7ec4444a7f1113ddbf4c2e155f39850df159feff26c3a549191b302"} Feb 18 06:17:56 crc kubenswrapper[4869]: I0218 06:17:56.257084 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d4jbn" event={"ID":"d4b7d5ea-dca6-4f74-8143-17a7573402d3","Type":"ContainerStarted","Data":"2e06917f13461b30b70c16b91b9decedaa7542de536ed6b15439da318b78c2ee"} Feb 18 06:17:56 crc kubenswrapper[4869]: I0218 06:17:56.276803 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d4jbn" podStartSLOduration=1.789095484 podStartE2EDuration="2.276784044s" podCreationTimestamp="2026-02-18 06:17:54 +0000 UTC" firstStartedPulling="2026-02-18 06:17:55.255236497 +0000 UTC m=+1772.424324729" lastFinishedPulling="2026-02-18 06:17:55.742925057 +0000 UTC m=+1772.912013289" observedRunningTime="2026-02-18 06:17:56.270856425 +0000 UTC m=+1773.439944657" watchObservedRunningTime="2026-02-18 06:17:56.276784044 +0000 UTC m=+1773.445872266" Feb 18 06:18:02 crc kubenswrapper[4869]: I0218 06:18:02.470627 4869 scope.go:117] "RemoveContainer" containerID="6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7" Feb 18 06:18:02 crc kubenswrapper[4869]: E0218 06:18:02.472284 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:18:04 crc kubenswrapper[4869]: I0218 06:18:04.326053 4869 generic.go:334] "Generic (PLEG): container finished" podID="d4b7d5ea-dca6-4f74-8143-17a7573402d3" containerID="1967eb29c7ec4444a7f1113ddbf4c2e155f39850df159feff26c3a549191b302" exitCode=0 Feb 18 06:18:04 crc kubenswrapper[4869]: I0218 06:18:04.326168 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d4jbn" event={"ID":"d4b7d5ea-dca6-4f74-8143-17a7573402d3","Type":"ContainerDied","Data":"1967eb29c7ec4444a7f1113ddbf4c2e155f39850df159feff26c3a549191b302"} Feb 18 06:18:05 crc kubenswrapper[4869]: I0218 06:18:05.774222 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d4jbn" Feb 18 06:18:05 crc kubenswrapper[4869]: I0218 06:18:05.826898 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d4b7d5ea-dca6-4f74-8143-17a7573402d3-ssh-key-openstack-edpm-ipam\") pod \"d4b7d5ea-dca6-4f74-8143-17a7573402d3\" (UID: \"d4b7d5ea-dca6-4f74-8143-17a7573402d3\") " Feb 18 06:18:05 crc kubenswrapper[4869]: I0218 06:18:05.827038 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqjhg\" (UniqueName: \"kubernetes.io/projected/d4b7d5ea-dca6-4f74-8143-17a7573402d3-kube-api-access-zqjhg\") pod \"d4b7d5ea-dca6-4f74-8143-17a7573402d3\" (UID: \"d4b7d5ea-dca6-4f74-8143-17a7573402d3\") " Feb 18 06:18:05 crc kubenswrapper[4869]: I0218 06:18:05.827134 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/d4b7d5ea-dca6-4f74-8143-17a7573402d3-inventory\") pod \"d4b7d5ea-dca6-4f74-8143-17a7573402d3\" (UID: \"d4b7d5ea-dca6-4f74-8143-17a7573402d3\") " Feb 18 06:18:05 crc kubenswrapper[4869]: I0218 06:18:05.835263 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4b7d5ea-dca6-4f74-8143-17a7573402d3-kube-api-access-zqjhg" (OuterVolumeSpecName: "kube-api-access-zqjhg") pod "d4b7d5ea-dca6-4f74-8143-17a7573402d3" (UID: "d4b7d5ea-dca6-4f74-8143-17a7573402d3"). InnerVolumeSpecName "kube-api-access-zqjhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:18:05 crc kubenswrapper[4869]: I0218 06:18:05.867268 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4b7d5ea-dca6-4f74-8143-17a7573402d3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d4b7d5ea-dca6-4f74-8143-17a7573402d3" (UID: "d4b7d5ea-dca6-4f74-8143-17a7573402d3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:18:05 crc kubenswrapper[4869]: I0218 06:18:05.891377 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4b7d5ea-dca6-4f74-8143-17a7573402d3-inventory" (OuterVolumeSpecName: "inventory") pod "d4b7d5ea-dca6-4f74-8143-17a7573402d3" (UID: "d4b7d5ea-dca6-4f74-8143-17a7573402d3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:18:05 crc kubenswrapper[4869]: I0218 06:18:05.929430 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqjhg\" (UniqueName: \"kubernetes.io/projected/d4b7d5ea-dca6-4f74-8143-17a7573402d3-kube-api-access-zqjhg\") on node \"crc\" DevicePath \"\"" Feb 18 06:18:05 crc kubenswrapper[4869]: I0218 06:18:05.929480 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4b7d5ea-dca6-4f74-8143-17a7573402d3-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:18:05 crc kubenswrapper[4869]: I0218 06:18:05.929490 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d4b7d5ea-dca6-4f74-8143-17a7573402d3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:18:06 crc kubenswrapper[4869]: I0218 06:18:06.344146 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d4jbn" event={"ID":"d4b7d5ea-dca6-4f74-8143-17a7573402d3","Type":"ContainerDied","Data":"2e06917f13461b30b70c16b91b9decedaa7542de536ed6b15439da318b78c2ee"} Feb 18 06:18:06 crc kubenswrapper[4869]: I0218 06:18:06.344479 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e06917f13461b30b70c16b91b9decedaa7542de536ed6b15439da318b78c2ee" Feb 18 06:18:06 crc kubenswrapper[4869]: I0218 06:18:06.344601 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d4jbn" Feb 18 06:18:06 crc kubenswrapper[4869]: I0218 06:18:06.431362 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz"] Feb 18 06:18:06 crc kubenswrapper[4869]: E0218 06:18:06.431965 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b7d5ea-dca6-4f74-8143-17a7573402d3" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 18 06:18:06 crc kubenswrapper[4869]: I0218 06:18:06.431987 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b7d5ea-dca6-4f74-8143-17a7573402d3" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 18 06:18:06 crc kubenswrapper[4869]: I0218 06:18:06.432189 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4b7d5ea-dca6-4f74-8143-17a7573402d3" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 18 06:18:06 crc kubenswrapper[4869]: I0218 06:18:06.433058 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz" Feb 18 06:18:06 crc kubenswrapper[4869]: I0218 06:18:06.435167 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:18:06 crc kubenswrapper[4869]: I0218 06:18:06.435587 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5vjl5" Feb 18 06:18:06 crc kubenswrapper[4869]: I0218 06:18:06.435679 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:18:06 crc kubenswrapper[4869]: I0218 06:18:06.435863 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:18:06 crc kubenswrapper[4869]: I0218 06:18:06.445858 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz"] Feb 18 06:18:06 crc kubenswrapper[4869]: I0218 06:18:06.542989 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9lcg\" (UniqueName: \"kubernetes.io/projected/4d6656f7-173a-4e9e-b802-6547876438ec-kube-api-access-d9lcg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz\" (UID: \"4d6656f7-173a-4e9e-b802-6547876438ec\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz" Feb 18 06:18:06 crc kubenswrapper[4869]: I0218 06:18:06.543052 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d6656f7-173a-4e9e-b802-6547876438ec-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz\" (UID: \"4d6656f7-173a-4e9e-b802-6547876438ec\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz" Feb 18 06:18:06 crc kubenswrapper[4869]: I0218 06:18:06.543169 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4d6656f7-173a-4e9e-b802-6547876438ec-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz\" (UID: \"4d6656f7-173a-4e9e-b802-6547876438ec\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz" Feb 18 06:18:06 crc kubenswrapper[4869]: I0218 06:18:06.645220 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4d6656f7-173a-4e9e-b802-6547876438ec-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz\" (UID: \"4d6656f7-173a-4e9e-b802-6547876438ec\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz" Feb 18 06:18:06 crc kubenswrapper[4869]: I0218 06:18:06.645675 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9lcg\" (UniqueName: \"kubernetes.io/projected/4d6656f7-173a-4e9e-b802-6547876438ec-kube-api-access-d9lcg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz\" (UID: \"4d6656f7-173a-4e9e-b802-6547876438ec\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz" Feb 18 06:18:06 crc kubenswrapper[4869]: I0218 06:18:06.645728 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d6656f7-173a-4e9e-b802-6547876438ec-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz\" (UID: \"4d6656f7-173a-4e9e-b802-6547876438ec\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz" Feb 18 06:18:06 crc kubenswrapper[4869]: I0218 06:18:06.653011 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/4d6656f7-173a-4e9e-b802-6547876438ec-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz\" (UID: \"4d6656f7-173a-4e9e-b802-6547876438ec\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz" Feb 18 06:18:06 crc kubenswrapper[4869]: I0218 06:18:06.654274 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d6656f7-173a-4e9e-b802-6547876438ec-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz\" (UID: \"4d6656f7-173a-4e9e-b802-6547876438ec\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz" Feb 18 06:18:06 crc kubenswrapper[4869]: I0218 06:18:06.676718 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9lcg\" (UniqueName: \"kubernetes.io/projected/4d6656f7-173a-4e9e-b802-6547876438ec-kube-api-access-d9lcg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz\" (UID: \"4d6656f7-173a-4e9e-b802-6547876438ec\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz" Feb 18 06:18:06 crc kubenswrapper[4869]: I0218 06:18:06.747195 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz" Feb 18 06:18:07 crc kubenswrapper[4869]: I0218 06:18:07.271276 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz"] Feb 18 06:18:07 crc kubenswrapper[4869]: W0218 06:18:07.278940 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d6656f7_173a_4e9e_b802_6547876438ec.slice/crio-c44d3568440adb85648b3e3b32268a14dcf0cdbf94289c124858a7a2f763f6c5 WatchSource:0}: Error finding container c44d3568440adb85648b3e3b32268a14dcf0cdbf94289c124858a7a2f763f6c5: Status 404 returned error can't find the container with id c44d3568440adb85648b3e3b32268a14dcf0cdbf94289c124858a7a2f763f6c5 Feb 18 06:18:07 crc kubenswrapper[4869]: I0218 06:18:07.356042 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz" event={"ID":"4d6656f7-173a-4e9e-b802-6547876438ec","Type":"ContainerStarted","Data":"c44d3568440adb85648b3e3b32268a14dcf0cdbf94289c124858a7a2f763f6c5"} Feb 18 06:18:08 crc kubenswrapper[4869]: I0218 06:18:08.364490 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz" event={"ID":"4d6656f7-173a-4e9e-b802-6547876438ec","Type":"ContainerStarted","Data":"7b93a90f5ff8f9f44f83540f270e7baf92bd236a3fca4d021354e1ed99853940"} Feb 18 06:18:08 crc kubenswrapper[4869]: I0218 06:18:08.395364 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz" podStartSLOduration=1.870138908 podStartE2EDuration="2.395345422s" podCreationTimestamp="2026-02-18 06:18:06 +0000 UTC" firstStartedPulling="2026-02-18 06:18:07.281259926 +0000 UTC m=+1784.450348158" lastFinishedPulling="2026-02-18 06:18:07.80646644 +0000 UTC m=+1784.975554672" 
observedRunningTime="2026-02-18 06:18:08.38933697 +0000 UTC m=+1785.558425202" watchObservedRunningTime="2026-02-18 06:18:08.395345422 +0000 UTC m=+1785.564433654" Feb 18 06:18:16 crc kubenswrapper[4869]: I0218 06:18:16.470385 4869 scope.go:117] "RemoveContainer" containerID="6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7" Feb 18 06:18:16 crc kubenswrapper[4869]: E0218 06:18:16.471058 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:18:17 crc kubenswrapper[4869]: I0218 06:18:17.453381 4869 generic.go:334] "Generic (PLEG): container finished" podID="4d6656f7-173a-4e9e-b802-6547876438ec" containerID="7b93a90f5ff8f9f44f83540f270e7baf92bd236a3fca4d021354e1ed99853940" exitCode=0 Feb 18 06:18:17 crc kubenswrapper[4869]: I0218 06:18:17.453503 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz" event={"ID":"4d6656f7-173a-4e9e-b802-6547876438ec","Type":"ContainerDied","Data":"7b93a90f5ff8f9f44f83540f270e7baf92bd236a3fca4d021354e1ed99853940"} Feb 18 06:18:18 crc kubenswrapper[4869]: I0218 06:18:18.841807 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz" Feb 18 06:18:18 crc kubenswrapper[4869]: I0218 06:18:18.928382 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9lcg\" (UniqueName: \"kubernetes.io/projected/4d6656f7-173a-4e9e-b802-6547876438ec-kube-api-access-d9lcg\") pod \"4d6656f7-173a-4e9e-b802-6547876438ec\" (UID: \"4d6656f7-173a-4e9e-b802-6547876438ec\") " Feb 18 06:18:18 crc kubenswrapper[4869]: I0218 06:18:18.928448 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4d6656f7-173a-4e9e-b802-6547876438ec-ssh-key-openstack-edpm-ipam\") pod \"4d6656f7-173a-4e9e-b802-6547876438ec\" (UID: \"4d6656f7-173a-4e9e-b802-6547876438ec\") " Feb 18 06:18:18 crc kubenswrapper[4869]: I0218 06:18:18.928591 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d6656f7-173a-4e9e-b802-6547876438ec-inventory\") pod \"4d6656f7-173a-4e9e-b802-6547876438ec\" (UID: \"4d6656f7-173a-4e9e-b802-6547876438ec\") " Feb 18 06:18:18 crc kubenswrapper[4869]: I0218 06:18:18.934354 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d6656f7-173a-4e9e-b802-6547876438ec-kube-api-access-d9lcg" (OuterVolumeSpecName: "kube-api-access-d9lcg") pod "4d6656f7-173a-4e9e-b802-6547876438ec" (UID: "4d6656f7-173a-4e9e-b802-6547876438ec"). InnerVolumeSpecName "kube-api-access-d9lcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:18:18 crc kubenswrapper[4869]: I0218 06:18:18.954656 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d6656f7-173a-4e9e-b802-6547876438ec-inventory" (OuterVolumeSpecName: "inventory") pod "4d6656f7-173a-4e9e-b802-6547876438ec" (UID: "4d6656f7-173a-4e9e-b802-6547876438ec"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:18:18 crc kubenswrapper[4869]: I0218 06:18:18.955544 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d6656f7-173a-4e9e-b802-6547876438ec-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4d6656f7-173a-4e9e-b802-6547876438ec" (UID: "4d6656f7-173a-4e9e-b802-6547876438ec"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.030893 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9lcg\" (UniqueName: \"kubernetes.io/projected/4d6656f7-173a-4e9e-b802-6547876438ec-kube-api-access-d9lcg\") on node \"crc\" DevicePath \"\"" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.030930 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4d6656f7-173a-4e9e-b802-6547876438ec-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.030940 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d6656f7-173a-4e9e-b802-6547876438ec-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.471771 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.482234 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz" event={"ID":"4d6656f7-173a-4e9e-b802-6547876438ec","Type":"ContainerDied","Data":"c44d3568440adb85648b3e3b32268a14dcf0cdbf94289c124858a7a2f763f6c5"} Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.482311 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c44d3568440adb85648b3e3b32268a14dcf0cdbf94289c124858a7a2f763f6c5" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.556885 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55"] Feb 18 06:18:19 crc kubenswrapper[4869]: E0218 06:18:19.557542 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d6656f7-173a-4e9e-b802-6547876438ec" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.557566 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d6656f7-173a-4e9e-b802-6547876438ec" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.557928 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d6656f7-173a-4e9e-b802-6547876438ec" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.558731 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.561460 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.561661 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.561670 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.561954 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.561995 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.562109 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.562869 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.564546 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5vjl5" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.575029 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55"] Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.642622 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.642674 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.642718 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.642752 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.642826 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.642966 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.643053 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.643184 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.643237 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhkdq\" (UniqueName: 
\"kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-kube-api-access-lhkdq\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.643291 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.643379 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.643490 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.643566 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.643629 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.745643 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.745689 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhkdq\" (UniqueName: \"kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-kube-api-access-lhkdq\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.745713 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.745761 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.745809 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.745838 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.745861 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.745895 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.745920 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.745966 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.746857 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.746921 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.746946 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.746990 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.751467 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: 
\"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.751858 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.751919 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.752392 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.752719 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.752787 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.752951 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.753294 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.754570 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.754964 
4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.759390 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.761470 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.762633 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhkdq\" (UniqueName: \"kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-kube-api-access-lhkdq\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.761665 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bdr55\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:19 crc kubenswrapper[4869]: I0218 06:18:19.877681 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:20 crc kubenswrapper[4869]: I0218 06:18:20.420006 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55"] Feb 18 06:18:20 crc kubenswrapper[4869]: I0218 06:18:20.428621 4869 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 06:18:20 crc kubenswrapper[4869]: I0218 06:18:20.482520 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" event={"ID":"9e02b084-943e-4579-87f9-6a0cdff0d8c1","Type":"ContainerStarted","Data":"65ebc6ecce483246877b4cfb2ac45f58bc105fe39edf95e71203e625e9051887"} Feb 18 06:18:21 crc kubenswrapper[4869]: I0218 06:18:21.495360 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" event={"ID":"9e02b084-943e-4579-87f9-6a0cdff0d8c1","Type":"ContainerStarted","Data":"6681661114c4f00981425c056e54857d2fd7f588da0f633ea04d3a1d40902ffd"} Feb 18 06:18:21 crc kubenswrapper[4869]: I0218 06:18:21.539412 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" podStartSLOduration=2.102323001 podStartE2EDuration="2.53938537s" podCreationTimestamp="2026-02-18 06:18:19 +0000 UTC" firstStartedPulling="2026-02-18 06:18:20.428338236 +0000 UTC m=+1797.597426468" lastFinishedPulling="2026-02-18 
06:18:20.865400605 +0000 UTC m=+1798.034488837" observedRunningTime="2026-02-18 06:18:21.524039339 +0000 UTC m=+1798.693127621" watchObservedRunningTime="2026-02-18 06:18:21.53938537 +0000 UTC m=+1798.708473632" Feb 18 06:18:27 crc kubenswrapper[4869]: I0218 06:18:27.049746 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-7ctxv"] Feb 18 06:18:27 crc kubenswrapper[4869]: I0218 06:18:27.056912 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-7ctxv"] Feb 18 06:18:27 crc kubenswrapper[4869]: I0218 06:18:27.485503 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9913d3a8-cea5-410a-a39e-270de53de317" path="/var/lib/kubelet/pods/9913d3a8-cea5-410a-a39e-270de53de317/volumes" Feb 18 06:18:30 crc kubenswrapper[4869]: I0218 06:18:30.050233 4869 scope.go:117] "RemoveContainer" containerID="422c5e067356db016952362b4780dcd593e8431517871458c0caa57e91ae6645" Feb 18 06:18:30 crc kubenswrapper[4869]: I0218 06:18:30.093573 4869 scope.go:117] "RemoveContainer" containerID="134374408ed805a961975ba78d69790f9ef78101020a32401c5e9979e6211de8" Feb 18 06:18:30 crc kubenswrapper[4869]: I0218 06:18:30.137646 4869 scope.go:117] "RemoveContainer" containerID="baabe3f7136cd63129596c7e3c26aeaa0e47832cc6f1bfe5719b3cd61537ab40" Feb 18 06:18:31 crc kubenswrapper[4869]: I0218 06:18:31.471438 4869 scope.go:117] "RemoveContainer" containerID="6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7" Feb 18 06:18:31 crc kubenswrapper[4869]: E0218 06:18:31.472334 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" 
podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:18:45 crc kubenswrapper[4869]: I0218 06:18:45.470212 4869 scope.go:117] "RemoveContainer" containerID="6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7" Feb 18 06:18:45 crc kubenswrapper[4869]: I0218 06:18:45.728138 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerStarted","Data":"be49bab07a746e2c2d8af3b9281f04db227560a8f1eaeae98cdf0117c41dec9c"} Feb 18 06:18:56 crc kubenswrapper[4869]: I0218 06:18:56.825273 4869 generic.go:334] "Generic (PLEG): container finished" podID="9e02b084-943e-4579-87f9-6a0cdff0d8c1" containerID="6681661114c4f00981425c056e54857d2fd7f588da0f633ea04d3a1d40902ffd" exitCode=0 Feb 18 06:18:56 crc kubenswrapper[4869]: I0218 06:18:56.825354 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" event={"ID":"9e02b084-943e-4579-87f9-6a0cdff0d8c1","Type":"ContainerDied","Data":"6681661114c4f00981425c056e54857d2fd7f588da0f633ea04d3a1d40902ffd"} Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.205439 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.310224 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhkdq\" (UniqueName: \"kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-kube-api-access-lhkdq\") pod \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.310421 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.310485 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-ovn-combined-ca-bundle\") pod \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.310559 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.310596 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-repo-setup-combined-ca-bundle\") pod 
\"9e02b084-943e-4579-87f9-6a0cdff0d8c1\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.310647 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-ssh-key-openstack-edpm-ipam\") pod \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.310719 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.310797 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-neutron-metadata-combined-ca-bundle\") pod \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.310846 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-inventory\") pod \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.310893 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-telemetry-combined-ca-bundle\") pod \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " Feb 18 
06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.310945 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-libvirt-combined-ca-bundle\") pod \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.310978 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-openstack-edpm-ipam-ovn-default-certs-0\") pod \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.311028 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-nova-combined-ca-bundle\") pod \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.312296 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-bootstrap-combined-ca-bundle\") pod \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\" (UID: \"9e02b084-943e-4579-87f9-6a0cdff0d8c1\") " Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.318392 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "9e02b084-943e-4579-87f9-6a0cdff0d8c1" (UID: "9e02b084-943e-4579-87f9-6a0cdff0d8c1"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.318413 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-kube-api-access-lhkdq" (OuterVolumeSpecName: "kube-api-access-lhkdq") pod "9e02b084-943e-4579-87f9-6a0cdff0d8c1" (UID: "9e02b084-943e-4579-87f9-6a0cdff0d8c1"). InnerVolumeSpecName "kube-api-access-lhkdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.318488 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "9e02b084-943e-4579-87f9-6a0cdff0d8c1" (UID: "9e02b084-943e-4579-87f9-6a0cdff0d8c1"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.318502 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "9e02b084-943e-4579-87f9-6a0cdff0d8c1" (UID: "9e02b084-943e-4579-87f9-6a0cdff0d8c1"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.319202 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "9e02b084-943e-4579-87f9-6a0cdff0d8c1" (UID: "9e02b084-943e-4579-87f9-6a0cdff0d8c1"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.319549 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "9e02b084-943e-4579-87f9-6a0cdff0d8c1" (UID: "9e02b084-943e-4579-87f9-6a0cdff0d8c1"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.319895 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9e02b084-943e-4579-87f9-6a0cdff0d8c1" (UID: "9e02b084-943e-4579-87f9-6a0cdff0d8c1"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.322790 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "9e02b084-943e-4579-87f9-6a0cdff0d8c1" (UID: "9e02b084-943e-4579-87f9-6a0cdff0d8c1"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.322985 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "9e02b084-943e-4579-87f9-6a0cdff0d8c1" (UID: "9e02b084-943e-4579-87f9-6a0cdff0d8c1"). 
InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.325646 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "9e02b084-943e-4579-87f9-6a0cdff0d8c1" (UID: "9e02b084-943e-4579-87f9-6a0cdff0d8c1"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.325850 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9e02b084-943e-4579-87f9-6a0cdff0d8c1" (UID: "9e02b084-943e-4579-87f9-6a0cdff0d8c1"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.334500 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9e02b084-943e-4579-87f9-6a0cdff0d8c1" (UID: "9e02b084-943e-4579-87f9-6a0cdff0d8c1"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.351312 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9e02b084-943e-4579-87f9-6a0cdff0d8c1" (UID: "9e02b084-943e-4579-87f9-6a0cdff0d8c1"). 
InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.351396 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-inventory" (OuterVolumeSpecName: "inventory") pod "9e02b084-943e-4579-87f9-6a0cdff0d8c1" (UID: "9e02b084-943e-4579-87f9-6a0cdff0d8c1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.414783 4869 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.414817 4869 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.414830 4869 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.414843 4869 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.414853 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.414863 4869 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.414873 4869 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.414882 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.414891 4869 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.414899 4869 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.414908 4869 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.414916 4869 
reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.414924 4869 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02b084-943e-4579-87f9-6a0cdff0d8c1-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.414932 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhkdq\" (UniqueName: \"kubernetes.io/projected/9e02b084-943e-4579-87f9-6a0cdff0d8c1-kube-api-access-lhkdq\") on node \"crc\" DevicePath \"\"" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.845676 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" event={"ID":"9e02b084-943e-4579-87f9-6a0cdff0d8c1","Type":"ContainerDied","Data":"65ebc6ecce483246877b4cfb2ac45f58bc105fe39edf95e71203e625e9051887"} Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.845988 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65ebc6ecce483246877b4cfb2ac45f58bc105fe39edf95e71203e625e9051887" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.845799 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bdr55" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.940286 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqrfk"] Feb 18 06:18:58 crc kubenswrapper[4869]: E0218 06:18:58.940679 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e02b084-943e-4579-87f9-6a0cdff0d8c1" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.940699 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e02b084-943e-4579-87f9-6a0cdff0d8c1" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.940917 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e02b084-943e-4579-87f9-6a0cdff0d8c1" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.941536 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqrfk" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.943868 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.945052 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.945240 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.945451 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.945858 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5vjl5" Feb 18 06:18:58 crc kubenswrapper[4869]: I0218 06:18:58.966890 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqrfk"] Feb 18 06:18:59 crc kubenswrapper[4869]: I0218 06:18:59.128484 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b8020d5a-e997-4376-bef7-488e40f51277-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqrfk\" (UID: \"b8020d5a-e997-4376-bef7-488e40f51277\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqrfk" Feb 18 06:18:59 crc kubenswrapper[4869]: I0218 06:18:59.128561 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8020d5a-e997-4376-bef7-488e40f51277-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqrfk\" (UID: 
\"b8020d5a-e997-4376-bef7-488e40f51277\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqrfk" Feb 18 06:18:59 crc kubenswrapper[4869]: I0218 06:18:59.128612 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b8020d5a-e997-4376-bef7-488e40f51277-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqrfk\" (UID: \"b8020d5a-e997-4376-bef7-488e40f51277\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqrfk" Feb 18 06:18:59 crc kubenswrapper[4869]: I0218 06:18:59.128639 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2zlw\" (UniqueName: \"kubernetes.io/projected/b8020d5a-e997-4376-bef7-488e40f51277-kube-api-access-h2zlw\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqrfk\" (UID: \"b8020d5a-e997-4376-bef7-488e40f51277\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqrfk" Feb 18 06:18:59 crc kubenswrapper[4869]: I0218 06:18:59.128694 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8020d5a-e997-4376-bef7-488e40f51277-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqrfk\" (UID: \"b8020d5a-e997-4376-bef7-488e40f51277\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqrfk" Feb 18 06:18:59 crc kubenswrapper[4869]: I0218 06:18:59.230237 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b8020d5a-e997-4376-bef7-488e40f51277-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqrfk\" (UID: \"b8020d5a-e997-4376-bef7-488e40f51277\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqrfk" Feb 18 06:18:59 crc kubenswrapper[4869]: I0218 06:18:59.230308 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8020d5a-e997-4376-bef7-488e40f51277-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqrfk\" (UID: \"b8020d5a-e997-4376-bef7-488e40f51277\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqrfk" Feb 18 06:18:59 crc kubenswrapper[4869]: I0218 06:18:59.230355 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b8020d5a-e997-4376-bef7-488e40f51277-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqrfk\" (UID: \"b8020d5a-e997-4376-bef7-488e40f51277\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqrfk" Feb 18 06:18:59 crc kubenswrapper[4869]: I0218 06:18:59.230391 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2zlw\" (UniqueName: \"kubernetes.io/projected/b8020d5a-e997-4376-bef7-488e40f51277-kube-api-access-h2zlw\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqrfk\" (UID: \"b8020d5a-e997-4376-bef7-488e40f51277\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqrfk" Feb 18 06:18:59 crc kubenswrapper[4869]: I0218 06:18:59.230445 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8020d5a-e997-4376-bef7-488e40f51277-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqrfk\" (UID: \"b8020d5a-e997-4376-bef7-488e40f51277\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqrfk" Feb 18 06:18:59 crc kubenswrapper[4869]: I0218 06:18:59.231558 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b8020d5a-e997-4376-bef7-488e40f51277-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqrfk\" (UID: \"b8020d5a-e997-4376-bef7-488e40f51277\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqrfk" Feb 18 06:18:59 crc kubenswrapper[4869]: I0218 06:18:59.234791 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b8020d5a-e997-4376-bef7-488e40f51277-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqrfk\" (UID: \"b8020d5a-e997-4376-bef7-488e40f51277\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqrfk" Feb 18 06:18:59 crc kubenswrapper[4869]: I0218 06:18:59.235830 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8020d5a-e997-4376-bef7-488e40f51277-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqrfk\" (UID: \"b8020d5a-e997-4376-bef7-488e40f51277\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqrfk" Feb 18 06:18:59 crc kubenswrapper[4869]: I0218 06:18:59.245340 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8020d5a-e997-4376-bef7-488e40f51277-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqrfk\" (UID: \"b8020d5a-e997-4376-bef7-488e40f51277\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqrfk" Feb 18 06:18:59 crc kubenswrapper[4869]: I0218 06:18:59.247006 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2zlw\" (UniqueName: \"kubernetes.io/projected/b8020d5a-e997-4376-bef7-488e40f51277-kube-api-access-h2zlw\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqrfk\" (UID: \"b8020d5a-e997-4376-bef7-488e40f51277\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqrfk" Feb 18 06:18:59 crc kubenswrapper[4869]: I0218 06:18:59.258671 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqrfk" Feb 18 06:18:59 crc kubenswrapper[4869]: I0218 06:18:59.772656 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqrfk"] Feb 18 06:18:59 crc kubenswrapper[4869]: I0218 06:18:59.855636 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqrfk" event={"ID":"b8020d5a-e997-4376-bef7-488e40f51277","Type":"ContainerStarted","Data":"85aebfce8420ad5d4bad4eaa9b22a3dd884aa76c84e4ac92ab2a78fb30f268d1"} Feb 18 06:19:00 crc kubenswrapper[4869]: I0218 06:19:00.865817 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqrfk" event={"ID":"b8020d5a-e997-4376-bef7-488e40f51277","Type":"ContainerStarted","Data":"df2aa81e2d697f77364cddc45c5354633ccdc7de949b900644501e9853b94b87"} Feb 18 06:19:00 crc kubenswrapper[4869]: I0218 06:19:00.914587 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqrfk" podStartSLOduration=2.498289368 podStartE2EDuration="2.914565297s" podCreationTimestamp="2026-02-18 06:18:58 +0000 UTC" firstStartedPulling="2026-02-18 06:18:59.778054212 +0000 UTC m=+1836.947142444" lastFinishedPulling="2026-02-18 06:19:00.194330141 +0000 UTC m=+1837.363418373" observedRunningTime="2026-02-18 06:19:00.889853845 +0000 UTC m=+1838.058942097" watchObservedRunningTime="2026-02-18 06:19:00.914565297 +0000 UTC m=+1838.083653529" Feb 18 06:19:58 crc kubenswrapper[4869]: I0218 06:19:58.843587 4869 generic.go:334] "Generic (PLEG): container finished" podID="b8020d5a-e997-4376-bef7-488e40f51277" containerID="df2aa81e2d697f77364cddc45c5354633ccdc7de949b900644501e9853b94b87" exitCode=0 Feb 18 06:19:58 crc kubenswrapper[4869]: I0218 06:19:58.843668 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqrfk" event={"ID":"b8020d5a-e997-4376-bef7-488e40f51277","Type":"ContainerDied","Data":"df2aa81e2d697f77364cddc45c5354633ccdc7de949b900644501e9853b94b87"} Feb 18 06:20:00 crc kubenswrapper[4869]: I0218 06:20:00.278644 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqrfk" Feb 18 06:20:00 crc kubenswrapper[4869]: I0218 06:20:00.364757 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b8020d5a-e997-4376-bef7-488e40f51277-ssh-key-openstack-edpm-ipam\") pod \"b8020d5a-e997-4376-bef7-488e40f51277\" (UID: \"b8020d5a-e997-4376-bef7-488e40f51277\") " Feb 18 06:20:00 crc kubenswrapper[4869]: I0218 06:20:00.365142 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b8020d5a-e997-4376-bef7-488e40f51277-ovncontroller-config-0\") pod \"b8020d5a-e997-4376-bef7-488e40f51277\" (UID: \"b8020d5a-e997-4376-bef7-488e40f51277\") " Feb 18 06:20:00 crc kubenswrapper[4869]: I0218 06:20:00.365263 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8020d5a-e997-4376-bef7-488e40f51277-inventory\") pod \"b8020d5a-e997-4376-bef7-488e40f51277\" (UID: \"b8020d5a-e997-4376-bef7-488e40f51277\") " Feb 18 06:20:00 crc kubenswrapper[4869]: I0218 06:20:00.365341 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8020d5a-e997-4376-bef7-488e40f51277-ovn-combined-ca-bundle\") pod \"b8020d5a-e997-4376-bef7-488e40f51277\" (UID: \"b8020d5a-e997-4376-bef7-488e40f51277\") " Feb 18 06:20:00 crc kubenswrapper[4869]: I0218 06:20:00.365449 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-h2zlw\" (UniqueName: \"kubernetes.io/projected/b8020d5a-e997-4376-bef7-488e40f51277-kube-api-access-h2zlw\") pod \"b8020d5a-e997-4376-bef7-488e40f51277\" (UID: \"b8020d5a-e997-4376-bef7-488e40f51277\") " Feb 18 06:20:00 crc kubenswrapper[4869]: I0218 06:20:00.370670 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8020d5a-e997-4376-bef7-488e40f51277-kube-api-access-h2zlw" (OuterVolumeSpecName: "kube-api-access-h2zlw") pod "b8020d5a-e997-4376-bef7-488e40f51277" (UID: "b8020d5a-e997-4376-bef7-488e40f51277"). InnerVolumeSpecName "kube-api-access-h2zlw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:20:00 crc kubenswrapper[4869]: I0218 06:20:00.378972 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8020d5a-e997-4376-bef7-488e40f51277-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "b8020d5a-e997-4376-bef7-488e40f51277" (UID: "b8020d5a-e997-4376-bef7-488e40f51277"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:20:00 crc kubenswrapper[4869]: I0218 06:20:00.397107 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8020d5a-e997-4376-bef7-488e40f51277-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "b8020d5a-e997-4376-bef7-488e40f51277" (UID: "b8020d5a-e997-4376-bef7-488e40f51277"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:20:00 crc kubenswrapper[4869]: I0218 06:20:00.399332 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8020d5a-e997-4376-bef7-488e40f51277-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b8020d5a-e997-4376-bef7-488e40f51277" (UID: "b8020d5a-e997-4376-bef7-488e40f51277"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:20:00 crc kubenswrapper[4869]: I0218 06:20:00.399912 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8020d5a-e997-4376-bef7-488e40f51277-inventory" (OuterVolumeSpecName: "inventory") pod "b8020d5a-e997-4376-bef7-488e40f51277" (UID: "b8020d5a-e997-4376-bef7-488e40f51277"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:20:00 crc kubenswrapper[4869]: I0218 06:20:00.467703 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b8020d5a-e997-4376-bef7-488e40f51277-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:00 crc kubenswrapper[4869]: I0218 06:20:00.467737 4869 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b8020d5a-e997-4376-bef7-488e40f51277-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:00 crc kubenswrapper[4869]: I0218 06:20:00.467766 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8020d5a-e997-4376-bef7-488e40f51277-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:00 crc kubenswrapper[4869]: I0218 06:20:00.467775 4869 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b8020d5a-e997-4376-bef7-488e40f51277-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:00 crc kubenswrapper[4869]: I0218 06:20:00.467784 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2zlw\" (UniqueName: \"kubernetes.io/projected/b8020d5a-e997-4376-bef7-488e40f51277-kube-api-access-h2zlw\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:00 crc kubenswrapper[4869]: I0218 06:20:00.858735 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqrfk" event={"ID":"b8020d5a-e997-4376-bef7-488e40f51277","Type":"ContainerDied","Data":"85aebfce8420ad5d4bad4eaa9b22a3dd884aa76c84e4ac92ab2a78fb30f268d1"} Feb 18 06:20:00 crc kubenswrapper[4869]: I0218 06:20:00.859091 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85aebfce8420ad5d4bad4eaa9b22a3dd884aa76c84e4ac92ab2a78fb30f268d1" Feb 18 06:20:00 crc kubenswrapper[4869]: I0218 06:20:00.858971 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqrfk" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.033639 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw"] Feb 18 06:20:01 crc kubenswrapper[4869]: E0218 06:20:01.034107 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8020d5a-e997-4376-bef7-488e40f51277" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.034130 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8020d5a-e997-4376-bef7-488e40f51277" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.034383 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8020d5a-e997-4376-bef7-488e40f51277" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.035135 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.037838 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5vjl5" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.038550 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.038881 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.039091 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.039538 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.039829 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.045434 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw"] Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.185439 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw\" (UID: \"4c96cef9-45b8-4639-a368-063acac72c83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.185499 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw\" (UID: \"4c96cef9-45b8-4639-a368-063acac72c83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.185536 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw\" (UID: \"4c96cef9-45b8-4639-a368-063acac72c83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.185565 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw\" (UID: \"4c96cef9-45b8-4639-a368-063acac72c83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.185665 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5nhc\" (UniqueName: \"kubernetes.io/projected/4c96cef9-45b8-4639-a368-063acac72c83-kube-api-access-f5nhc\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw\" (UID: \"4c96cef9-45b8-4639-a368-063acac72c83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.185689 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw\" (UID: \"4c96cef9-45b8-4639-a368-063acac72c83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.287948 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw\" (UID: \"4c96cef9-45b8-4639-a368-063acac72c83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.288020 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw\" (UID: \"4c96cef9-45b8-4639-a368-063acac72c83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.288070 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw\" (UID: \"4c96cef9-45b8-4639-a368-063acac72c83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.288099 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" 
(UniqueName: \"kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw\" (UID: \"4c96cef9-45b8-4639-a368-063acac72c83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.288165 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5nhc\" (UniqueName: \"kubernetes.io/projected/4c96cef9-45b8-4639-a368-063acac72c83-kube-api-access-f5nhc\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw\" (UID: \"4c96cef9-45b8-4639-a368-063acac72c83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.288186 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw\" (UID: \"4c96cef9-45b8-4639-a368-063acac72c83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.292956 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw\" (UID: \"4c96cef9-45b8-4639-a368-063acac72c83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.292976 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-neutron-metadata-combined-ca-bundle\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw\" (UID: \"4c96cef9-45b8-4639-a368-063acac72c83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.293007 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw\" (UID: \"4c96cef9-45b8-4639-a368-063acac72c83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.293569 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw\" (UID: \"4c96cef9-45b8-4639-a368-063acac72c83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.294052 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw\" (UID: \"4c96cef9-45b8-4639-a368-063acac72c83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.309142 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5nhc\" (UniqueName: \"kubernetes.io/projected/4c96cef9-45b8-4639-a368-063acac72c83-kube-api-access-f5nhc\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw\" (UID: \"4c96cef9-45b8-4639-a368-063acac72c83\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.356985 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.357356 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h8rg8"] Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.359217 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h8rg8" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.371732 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h8rg8"] Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.515413 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d56v\" (UniqueName: \"kubernetes.io/projected/813bec50-cc13-428c-a6be-1c71e85f1fc3-kube-api-access-8d56v\") pod \"redhat-marketplace-h8rg8\" (UID: \"813bec50-cc13-428c-a6be-1c71e85f1fc3\") " pod="openshift-marketplace/redhat-marketplace-h8rg8" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.516433 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/813bec50-cc13-428c-a6be-1c71e85f1fc3-catalog-content\") pod \"redhat-marketplace-h8rg8\" (UID: \"813bec50-cc13-428c-a6be-1c71e85f1fc3\") " pod="openshift-marketplace/redhat-marketplace-h8rg8" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.516668 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/813bec50-cc13-428c-a6be-1c71e85f1fc3-utilities\") pod \"redhat-marketplace-h8rg8\" (UID: 
\"813bec50-cc13-428c-a6be-1c71e85f1fc3\") " pod="openshift-marketplace/redhat-marketplace-h8rg8" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.618300 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/813bec50-cc13-428c-a6be-1c71e85f1fc3-catalog-content\") pod \"redhat-marketplace-h8rg8\" (UID: \"813bec50-cc13-428c-a6be-1c71e85f1fc3\") " pod="openshift-marketplace/redhat-marketplace-h8rg8" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.618428 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/813bec50-cc13-428c-a6be-1c71e85f1fc3-utilities\") pod \"redhat-marketplace-h8rg8\" (UID: \"813bec50-cc13-428c-a6be-1c71e85f1fc3\") " pod="openshift-marketplace/redhat-marketplace-h8rg8" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.618499 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d56v\" (UniqueName: \"kubernetes.io/projected/813bec50-cc13-428c-a6be-1c71e85f1fc3-kube-api-access-8d56v\") pod \"redhat-marketplace-h8rg8\" (UID: \"813bec50-cc13-428c-a6be-1c71e85f1fc3\") " pod="openshift-marketplace/redhat-marketplace-h8rg8" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.619045 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/813bec50-cc13-428c-a6be-1c71e85f1fc3-utilities\") pod \"redhat-marketplace-h8rg8\" (UID: \"813bec50-cc13-428c-a6be-1c71e85f1fc3\") " pod="openshift-marketplace/redhat-marketplace-h8rg8" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.619334 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/813bec50-cc13-428c-a6be-1c71e85f1fc3-catalog-content\") pod \"redhat-marketplace-h8rg8\" (UID: \"813bec50-cc13-428c-a6be-1c71e85f1fc3\") " 
pod="openshift-marketplace/redhat-marketplace-h8rg8" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.639038 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d56v\" (UniqueName: \"kubernetes.io/projected/813bec50-cc13-428c-a6be-1c71e85f1fc3-kube-api-access-8d56v\") pod \"redhat-marketplace-h8rg8\" (UID: \"813bec50-cc13-428c-a6be-1c71e85f1fc3\") " pod="openshift-marketplace/redhat-marketplace-h8rg8" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.793977 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h8rg8" Feb 18 06:20:01 crc kubenswrapper[4869]: I0218 06:20:01.976556 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw"] Feb 18 06:20:02 crc kubenswrapper[4869]: I0218 06:20:02.239304 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h8rg8"] Feb 18 06:20:02 crc kubenswrapper[4869]: W0218 06:20:02.241116 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod813bec50_cc13_428c_a6be_1c71e85f1fc3.slice/crio-dc2eb8d0e2e401552d34750dc5e1a2b72ded1b3c9fc3504a6499bba4f8d15928 WatchSource:0}: Error finding container dc2eb8d0e2e401552d34750dc5e1a2b72ded1b3c9fc3504a6499bba4f8d15928: Status 404 returned error can't find the container with id dc2eb8d0e2e401552d34750dc5e1a2b72ded1b3c9fc3504a6499bba4f8d15928 Feb 18 06:20:02 crc kubenswrapper[4869]: I0218 06:20:02.876208 4869 generic.go:334] "Generic (PLEG): container finished" podID="813bec50-cc13-428c-a6be-1c71e85f1fc3" containerID="1789d1294e0c2f7264aad059a574879c3cf3892830040f3dd04d4e3bfa3c2805" exitCode=0 Feb 18 06:20:02 crc kubenswrapper[4869]: I0218 06:20:02.877149 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h8rg8" 
event={"ID":"813bec50-cc13-428c-a6be-1c71e85f1fc3","Type":"ContainerDied","Data":"1789d1294e0c2f7264aad059a574879c3cf3892830040f3dd04d4e3bfa3c2805"} Feb 18 06:20:02 crc kubenswrapper[4869]: I0218 06:20:02.877183 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h8rg8" event={"ID":"813bec50-cc13-428c-a6be-1c71e85f1fc3","Type":"ContainerStarted","Data":"dc2eb8d0e2e401552d34750dc5e1a2b72ded1b3c9fc3504a6499bba4f8d15928"} Feb 18 06:20:02 crc kubenswrapper[4869]: I0218 06:20:02.882281 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw" event={"ID":"4c96cef9-45b8-4639-a368-063acac72c83","Type":"ContainerStarted","Data":"7740ddba409123a48a7cacd055acd52c7fdcc7a26f29165e295b3472183632d1"} Feb 18 06:20:02 crc kubenswrapper[4869]: I0218 06:20:02.882330 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw" event={"ID":"4c96cef9-45b8-4639-a368-063acac72c83","Type":"ContainerStarted","Data":"b26c73748d5d815bceacf55dcae48970960fa8bf0557ac8d3aaa36863665a5bd"} Feb 18 06:20:02 crc kubenswrapper[4869]: I0218 06:20:02.919720 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw" podStartSLOduration=1.462530114 podStartE2EDuration="1.919701215s" podCreationTimestamp="2026-02-18 06:20:01 +0000 UTC" firstStartedPulling="2026-02-18 06:20:01.991627068 +0000 UTC m=+1899.160715300" lastFinishedPulling="2026-02-18 06:20:02.448798169 +0000 UTC m=+1899.617886401" observedRunningTime="2026-02-18 06:20:02.913928938 +0000 UTC m=+1900.083017180" watchObservedRunningTime="2026-02-18 06:20:02.919701215 +0000 UTC m=+1900.088789447" Feb 18 06:20:04 crc kubenswrapper[4869]: I0218 06:20:04.902464 4869 generic.go:334] "Generic (PLEG): container finished" podID="813bec50-cc13-428c-a6be-1c71e85f1fc3" 
containerID="2097ba805d8b6f85fc7048a3bdaa71150fc5002e4c5b409a50f4d1090853c488" exitCode=0 Feb 18 06:20:04 crc kubenswrapper[4869]: I0218 06:20:04.902725 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h8rg8" event={"ID":"813bec50-cc13-428c-a6be-1c71e85f1fc3","Type":"ContainerDied","Data":"2097ba805d8b6f85fc7048a3bdaa71150fc5002e4c5b409a50f4d1090853c488"} Feb 18 06:20:05 crc kubenswrapper[4869]: I0218 06:20:05.914337 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h8rg8" event={"ID":"813bec50-cc13-428c-a6be-1c71e85f1fc3","Type":"ContainerStarted","Data":"b42acb77a7db63261d6c627992085e04df3fec3a63d73245b292fbea2b9a6099"} Feb 18 06:20:05 crc kubenswrapper[4869]: I0218 06:20:05.941257 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h8rg8" podStartSLOduration=2.540394857 podStartE2EDuration="4.941236393s" podCreationTimestamp="2026-02-18 06:20:01 +0000 UTC" firstStartedPulling="2026-02-18 06:20:02.879340895 +0000 UTC m=+1900.048429117" lastFinishedPulling="2026-02-18 06:20:05.280182431 +0000 UTC m=+1902.449270653" observedRunningTime="2026-02-18 06:20:05.931819351 +0000 UTC m=+1903.100907603" watchObservedRunningTime="2026-02-18 06:20:05.941236393 +0000 UTC m=+1903.110324625" Feb 18 06:20:11 crc kubenswrapper[4869]: I0218 06:20:11.794602 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h8rg8" Feb 18 06:20:11 crc kubenswrapper[4869]: I0218 06:20:11.795358 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h8rg8" Feb 18 06:20:11 crc kubenswrapper[4869]: I0218 06:20:11.848641 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h8rg8" Feb 18 06:20:12 crc kubenswrapper[4869]: I0218 06:20:12.035817 
4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h8rg8" Feb 18 06:20:12 crc kubenswrapper[4869]: I0218 06:20:12.104576 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h8rg8"] Feb 18 06:20:13 crc kubenswrapper[4869]: I0218 06:20:13.993831 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h8rg8" podUID="813bec50-cc13-428c-a6be-1c71e85f1fc3" containerName="registry-server" containerID="cri-o://b42acb77a7db63261d6c627992085e04df3fec3a63d73245b292fbea2b9a6099" gracePeriod=2 Feb 18 06:20:14 crc kubenswrapper[4869]: I0218 06:20:14.402300 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h8rg8" Feb 18 06:20:14 crc kubenswrapper[4869]: I0218 06:20:14.567301 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/813bec50-cc13-428c-a6be-1c71e85f1fc3-catalog-content\") pod \"813bec50-cc13-428c-a6be-1c71e85f1fc3\" (UID: \"813bec50-cc13-428c-a6be-1c71e85f1fc3\") " Feb 18 06:20:14 crc kubenswrapper[4869]: I0218 06:20:14.567400 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/813bec50-cc13-428c-a6be-1c71e85f1fc3-utilities\") pod \"813bec50-cc13-428c-a6be-1c71e85f1fc3\" (UID: \"813bec50-cc13-428c-a6be-1c71e85f1fc3\") " Feb 18 06:20:14 crc kubenswrapper[4869]: I0218 06:20:14.567503 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d56v\" (UniqueName: \"kubernetes.io/projected/813bec50-cc13-428c-a6be-1c71e85f1fc3-kube-api-access-8d56v\") pod \"813bec50-cc13-428c-a6be-1c71e85f1fc3\" (UID: \"813bec50-cc13-428c-a6be-1c71e85f1fc3\") " Feb 18 06:20:14 crc kubenswrapper[4869]: I0218 06:20:14.571046 
4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/813bec50-cc13-428c-a6be-1c71e85f1fc3-utilities" (OuterVolumeSpecName: "utilities") pod "813bec50-cc13-428c-a6be-1c71e85f1fc3" (UID: "813bec50-cc13-428c-a6be-1c71e85f1fc3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:20:14 crc kubenswrapper[4869]: I0218 06:20:14.579863 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/813bec50-cc13-428c-a6be-1c71e85f1fc3-kube-api-access-8d56v" (OuterVolumeSpecName: "kube-api-access-8d56v") pod "813bec50-cc13-428c-a6be-1c71e85f1fc3" (UID: "813bec50-cc13-428c-a6be-1c71e85f1fc3"). InnerVolumeSpecName "kube-api-access-8d56v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:20:14 crc kubenswrapper[4869]: I0218 06:20:14.669359 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/813bec50-cc13-428c-a6be-1c71e85f1fc3-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:14 crc kubenswrapper[4869]: I0218 06:20:14.669406 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d56v\" (UniqueName: \"kubernetes.io/projected/813bec50-cc13-428c-a6be-1c71e85f1fc3-kube-api-access-8d56v\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:14 crc kubenswrapper[4869]: I0218 06:20:14.745589 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/813bec50-cc13-428c-a6be-1c71e85f1fc3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "813bec50-cc13-428c-a6be-1c71e85f1fc3" (UID: "813bec50-cc13-428c-a6be-1c71e85f1fc3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:20:14 crc kubenswrapper[4869]: I0218 06:20:14.771027 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/813bec50-cc13-428c-a6be-1c71e85f1fc3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:15 crc kubenswrapper[4869]: I0218 06:20:15.003373 4869 generic.go:334] "Generic (PLEG): container finished" podID="813bec50-cc13-428c-a6be-1c71e85f1fc3" containerID="b42acb77a7db63261d6c627992085e04df3fec3a63d73245b292fbea2b9a6099" exitCode=0 Feb 18 06:20:15 crc kubenswrapper[4869]: I0218 06:20:15.003413 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h8rg8" event={"ID":"813bec50-cc13-428c-a6be-1c71e85f1fc3","Type":"ContainerDied","Data":"b42acb77a7db63261d6c627992085e04df3fec3a63d73245b292fbea2b9a6099"} Feb 18 06:20:15 crc kubenswrapper[4869]: I0218 06:20:15.003475 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h8rg8" Feb 18 06:20:15 crc kubenswrapper[4869]: I0218 06:20:15.003494 4869 scope.go:117] "RemoveContainer" containerID="b42acb77a7db63261d6c627992085e04df3fec3a63d73245b292fbea2b9a6099" Feb 18 06:20:15 crc kubenswrapper[4869]: I0218 06:20:15.003480 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h8rg8" event={"ID":"813bec50-cc13-428c-a6be-1c71e85f1fc3","Type":"ContainerDied","Data":"dc2eb8d0e2e401552d34750dc5e1a2b72ded1b3c9fc3504a6499bba4f8d15928"} Feb 18 06:20:15 crc kubenswrapper[4869]: I0218 06:20:15.023471 4869 scope.go:117] "RemoveContainer" containerID="2097ba805d8b6f85fc7048a3bdaa71150fc5002e4c5b409a50f4d1090853c488" Feb 18 06:20:15 crc kubenswrapper[4869]: I0218 06:20:15.038868 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h8rg8"] Feb 18 06:20:15 crc kubenswrapper[4869]: I0218 06:20:15.069216 4869 scope.go:117] "RemoveContainer" containerID="1789d1294e0c2f7264aad059a574879c3cf3892830040f3dd04d4e3bfa3c2805" Feb 18 06:20:15 crc kubenswrapper[4869]: I0218 06:20:15.083168 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h8rg8"] Feb 18 06:20:15 crc kubenswrapper[4869]: I0218 06:20:15.100442 4869 scope.go:117] "RemoveContainer" containerID="b42acb77a7db63261d6c627992085e04df3fec3a63d73245b292fbea2b9a6099" Feb 18 06:20:15 crc kubenswrapper[4869]: E0218 06:20:15.100999 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b42acb77a7db63261d6c627992085e04df3fec3a63d73245b292fbea2b9a6099\": container with ID starting with b42acb77a7db63261d6c627992085e04df3fec3a63d73245b292fbea2b9a6099 not found: ID does not exist" containerID="b42acb77a7db63261d6c627992085e04df3fec3a63d73245b292fbea2b9a6099" Feb 18 06:20:15 crc kubenswrapper[4869]: I0218 06:20:15.101055 4869 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b42acb77a7db63261d6c627992085e04df3fec3a63d73245b292fbea2b9a6099"} err="failed to get container status \"b42acb77a7db63261d6c627992085e04df3fec3a63d73245b292fbea2b9a6099\": rpc error: code = NotFound desc = could not find container \"b42acb77a7db63261d6c627992085e04df3fec3a63d73245b292fbea2b9a6099\": container with ID starting with b42acb77a7db63261d6c627992085e04df3fec3a63d73245b292fbea2b9a6099 not found: ID does not exist" Feb 18 06:20:15 crc kubenswrapper[4869]: I0218 06:20:15.101088 4869 scope.go:117] "RemoveContainer" containerID="2097ba805d8b6f85fc7048a3bdaa71150fc5002e4c5b409a50f4d1090853c488" Feb 18 06:20:15 crc kubenswrapper[4869]: E0218 06:20:15.101399 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2097ba805d8b6f85fc7048a3bdaa71150fc5002e4c5b409a50f4d1090853c488\": container with ID starting with 2097ba805d8b6f85fc7048a3bdaa71150fc5002e4c5b409a50f4d1090853c488 not found: ID does not exist" containerID="2097ba805d8b6f85fc7048a3bdaa71150fc5002e4c5b409a50f4d1090853c488" Feb 18 06:20:15 crc kubenswrapper[4869]: I0218 06:20:15.101425 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2097ba805d8b6f85fc7048a3bdaa71150fc5002e4c5b409a50f4d1090853c488"} err="failed to get container status \"2097ba805d8b6f85fc7048a3bdaa71150fc5002e4c5b409a50f4d1090853c488\": rpc error: code = NotFound desc = could not find container \"2097ba805d8b6f85fc7048a3bdaa71150fc5002e4c5b409a50f4d1090853c488\": container with ID starting with 2097ba805d8b6f85fc7048a3bdaa71150fc5002e4c5b409a50f4d1090853c488 not found: ID does not exist" Feb 18 06:20:15 crc kubenswrapper[4869]: I0218 06:20:15.101443 4869 scope.go:117] "RemoveContainer" containerID="1789d1294e0c2f7264aad059a574879c3cf3892830040f3dd04d4e3bfa3c2805" Feb 18 06:20:15 crc kubenswrapper[4869]: E0218 
06:20:15.101943 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1789d1294e0c2f7264aad059a574879c3cf3892830040f3dd04d4e3bfa3c2805\": container with ID starting with 1789d1294e0c2f7264aad059a574879c3cf3892830040f3dd04d4e3bfa3c2805 not found: ID does not exist" containerID="1789d1294e0c2f7264aad059a574879c3cf3892830040f3dd04d4e3bfa3c2805" Feb 18 06:20:15 crc kubenswrapper[4869]: I0218 06:20:15.101970 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1789d1294e0c2f7264aad059a574879c3cf3892830040f3dd04d4e3bfa3c2805"} err="failed to get container status \"1789d1294e0c2f7264aad059a574879c3cf3892830040f3dd04d4e3bfa3c2805\": rpc error: code = NotFound desc = could not find container \"1789d1294e0c2f7264aad059a574879c3cf3892830040f3dd04d4e3bfa3c2805\": container with ID starting with 1789d1294e0c2f7264aad059a574879c3cf3892830040f3dd04d4e3bfa3c2805 not found: ID does not exist" Feb 18 06:20:15 crc kubenswrapper[4869]: I0218 06:20:15.480055 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="813bec50-cc13-428c-a6be-1c71e85f1fc3" path="/var/lib/kubelet/pods/813bec50-cc13-428c-a6be-1c71e85f1fc3/volumes" Feb 18 06:20:37 crc kubenswrapper[4869]: I0218 06:20:37.304398 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kdp9c"] Feb 18 06:20:37 crc kubenswrapper[4869]: E0218 06:20:37.305290 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="813bec50-cc13-428c-a6be-1c71e85f1fc3" containerName="extract-content" Feb 18 06:20:37 crc kubenswrapper[4869]: I0218 06:20:37.305303 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="813bec50-cc13-428c-a6be-1c71e85f1fc3" containerName="extract-content" Feb 18 06:20:37 crc kubenswrapper[4869]: E0218 06:20:37.305322 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="813bec50-cc13-428c-a6be-1c71e85f1fc3" 
containerName="registry-server" Feb 18 06:20:37 crc kubenswrapper[4869]: I0218 06:20:37.305329 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="813bec50-cc13-428c-a6be-1c71e85f1fc3" containerName="registry-server" Feb 18 06:20:37 crc kubenswrapper[4869]: E0218 06:20:37.305343 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="813bec50-cc13-428c-a6be-1c71e85f1fc3" containerName="extract-utilities" Feb 18 06:20:37 crc kubenswrapper[4869]: I0218 06:20:37.305350 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="813bec50-cc13-428c-a6be-1c71e85f1fc3" containerName="extract-utilities" Feb 18 06:20:37 crc kubenswrapper[4869]: I0218 06:20:37.305528 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="813bec50-cc13-428c-a6be-1c71e85f1fc3" containerName="registry-server" Feb 18 06:20:37 crc kubenswrapper[4869]: I0218 06:20:37.306881 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kdp9c" Feb 18 06:20:37 crc kubenswrapper[4869]: I0218 06:20:37.311934 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kdp9c"] Feb 18 06:20:37 crc kubenswrapper[4869]: I0218 06:20:37.394898 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5615c2e-3446-4dc9-8323-c2d74199c085-utilities\") pod \"redhat-operators-kdp9c\" (UID: \"f5615c2e-3446-4dc9-8323-c2d74199c085\") " pod="openshift-marketplace/redhat-operators-kdp9c" Feb 18 06:20:37 crc kubenswrapper[4869]: I0218 06:20:37.394967 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j47cw\" (UniqueName: \"kubernetes.io/projected/f5615c2e-3446-4dc9-8323-c2d74199c085-kube-api-access-j47cw\") pod \"redhat-operators-kdp9c\" (UID: \"f5615c2e-3446-4dc9-8323-c2d74199c085\") " 
pod="openshift-marketplace/redhat-operators-kdp9c" Feb 18 06:20:37 crc kubenswrapper[4869]: I0218 06:20:37.395031 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5615c2e-3446-4dc9-8323-c2d74199c085-catalog-content\") pod \"redhat-operators-kdp9c\" (UID: \"f5615c2e-3446-4dc9-8323-c2d74199c085\") " pod="openshift-marketplace/redhat-operators-kdp9c" Feb 18 06:20:37 crc kubenswrapper[4869]: I0218 06:20:37.496455 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j47cw\" (UniqueName: \"kubernetes.io/projected/f5615c2e-3446-4dc9-8323-c2d74199c085-kube-api-access-j47cw\") pod \"redhat-operators-kdp9c\" (UID: \"f5615c2e-3446-4dc9-8323-c2d74199c085\") " pod="openshift-marketplace/redhat-operators-kdp9c" Feb 18 06:20:37 crc kubenswrapper[4869]: I0218 06:20:37.496725 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5615c2e-3446-4dc9-8323-c2d74199c085-catalog-content\") pod \"redhat-operators-kdp9c\" (UID: \"f5615c2e-3446-4dc9-8323-c2d74199c085\") " pod="openshift-marketplace/redhat-operators-kdp9c" Feb 18 06:20:37 crc kubenswrapper[4869]: I0218 06:20:37.496944 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5615c2e-3446-4dc9-8323-c2d74199c085-utilities\") pod \"redhat-operators-kdp9c\" (UID: \"f5615c2e-3446-4dc9-8323-c2d74199c085\") " pod="openshift-marketplace/redhat-operators-kdp9c" Feb 18 06:20:37 crc kubenswrapper[4869]: I0218 06:20:37.497285 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5615c2e-3446-4dc9-8323-c2d74199c085-catalog-content\") pod \"redhat-operators-kdp9c\" (UID: \"f5615c2e-3446-4dc9-8323-c2d74199c085\") " 
pod="openshift-marketplace/redhat-operators-kdp9c" Feb 18 06:20:37 crc kubenswrapper[4869]: I0218 06:20:37.497316 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5615c2e-3446-4dc9-8323-c2d74199c085-utilities\") pod \"redhat-operators-kdp9c\" (UID: \"f5615c2e-3446-4dc9-8323-c2d74199c085\") " pod="openshift-marketplace/redhat-operators-kdp9c" Feb 18 06:20:37 crc kubenswrapper[4869]: I0218 06:20:37.522234 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j47cw\" (UniqueName: \"kubernetes.io/projected/f5615c2e-3446-4dc9-8323-c2d74199c085-kube-api-access-j47cw\") pod \"redhat-operators-kdp9c\" (UID: \"f5615c2e-3446-4dc9-8323-c2d74199c085\") " pod="openshift-marketplace/redhat-operators-kdp9c" Feb 18 06:20:37 crc kubenswrapper[4869]: I0218 06:20:37.637231 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kdp9c" Feb 18 06:20:38 crc kubenswrapper[4869]: I0218 06:20:38.129335 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kdp9c"] Feb 18 06:20:38 crc kubenswrapper[4869]: I0218 06:20:38.218667 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kdp9c" event={"ID":"f5615c2e-3446-4dc9-8323-c2d74199c085","Type":"ContainerStarted","Data":"ba380f81d0346a2eb522de3f7eea4f58972212fa14ef15a072faae5220576b4e"} Feb 18 06:20:39 crc kubenswrapper[4869]: I0218 06:20:39.226942 4869 generic.go:334] "Generic (PLEG): container finished" podID="f5615c2e-3446-4dc9-8323-c2d74199c085" containerID="8c6feefaed13ef862cd718ee431a86ee32f134b9a9abb0430d057f4b6ba99a4d" exitCode=0 Feb 18 06:20:39 crc kubenswrapper[4869]: I0218 06:20:39.227037 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kdp9c" 
event={"ID":"f5615c2e-3446-4dc9-8323-c2d74199c085","Type":"ContainerDied","Data":"8c6feefaed13ef862cd718ee431a86ee32f134b9a9abb0430d057f4b6ba99a4d"} Feb 18 06:20:40 crc kubenswrapper[4869]: I0218 06:20:40.236199 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kdp9c" event={"ID":"f5615c2e-3446-4dc9-8323-c2d74199c085","Type":"ContainerStarted","Data":"1bb00552d901ad74d900c90f60022cb77f7e641083e296b016891dae01ab5928"} Feb 18 06:20:41 crc kubenswrapper[4869]: I0218 06:20:41.254411 4869 generic.go:334] "Generic (PLEG): container finished" podID="f5615c2e-3446-4dc9-8323-c2d74199c085" containerID="1bb00552d901ad74d900c90f60022cb77f7e641083e296b016891dae01ab5928" exitCode=0 Feb 18 06:20:41 crc kubenswrapper[4869]: I0218 06:20:41.254451 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kdp9c" event={"ID":"f5615c2e-3446-4dc9-8323-c2d74199c085","Type":"ContainerDied","Data":"1bb00552d901ad74d900c90f60022cb77f7e641083e296b016891dae01ab5928"} Feb 18 06:20:43 crc kubenswrapper[4869]: I0218 06:20:43.269635 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kdp9c" event={"ID":"f5615c2e-3446-4dc9-8323-c2d74199c085","Type":"ContainerStarted","Data":"8356e8511aa8a6c7f4fcc1517b7f3019a6d8a2f01e3f5b6b6472e0fb97032887"} Feb 18 06:20:43 crc kubenswrapper[4869]: I0218 06:20:43.293128 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kdp9c" podStartSLOduration=3.779293801 podStartE2EDuration="6.293106847s" podCreationTimestamp="2026-02-18 06:20:37 +0000 UTC" firstStartedPulling="2026-02-18 06:20:39.230094732 +0000 UTC m=+1936.399182964" lastFinishedPulling="2026-02-18 06:20:41.743907778 +0000 UTC m=+1938.912996010" observedRunningTime="2026-02-18 06:20:43.285537579 +0000 UTC m=+1940.454625811" watchObservedRunningTime="2026-02-18 06:20:43.293106847 +0000 UTC m=+1940.462195099" 
Feb 18 06:20:47 crc kubenswrapper[4869]: I0218 06:20:47.638188 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kdp9c" Feb 18 06:20:47 crc kubenswrapper[4869]: I0218 06:20:47.638893 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kdp9c" Feb 18 06:20:48 crc kubenswrapper[4869]: I0218 06:20:48.697671 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kdp9c" podUID="f5615c2e-3446-4dc9-8323-c2d74199c085" containerName="registry-server" probeResult="failure" output=< Feb 18 06:20:48 crc kubenswrapper[4869]: timeout: failed to connect service ":50051" within 1s Feb 18 06:20:48 crc kubenswrapper[4869]: > Feb 18 06:20:49 crc kubenswrapper[4869]: I0218 06:20:49.326064 4869 generic.go:334] "Generic (PLEG): container finished" podID="4c96cef9-45b8-4639-a368-063acac72c83" containerID="7740ddba409123a48a7cacd055acd52c7fdcc7a26f29165e295b3472183632d1" exitCode=0 Feb 18 06:20:49 crc kubenswrapper[4869]: I0218 06:20:49.326151 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw" event={"ID":"4c96cef9-45b8-4639-a368-063acac72c83","Type":"ContainerDied","Data":"7740ddba409123a48a7cacd055acd52c7fdcc7a26f29165e295b3472183632d1"} Feb 18 06:20:50 crc kubenswrapper[4869]: I0218 06:20:50.796772 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw" Feb 18 06:20:50 crc kubenswrapper[4869]: I0218 06:20:50.862704 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-neutron-ovn-metadata-agent-neutron-config-0\") pod \"4c96cef9-45b8-4639-a368-063acac72c83\" (UID: \"4c96cef9-45b8-4639-a368-063acac72c83\") " Feb 18 06:20:50 crc kubenswrapper[4869]: I0218 06:20:50.862826 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-nova-metadata-neutron-config-0\") pod \"4c96cef9-45b8-4639-a368-063acac72c83\" (UID: \"4c96cef9-45b8-4639-a368-063acac72c83\") " Feb 18 06:20:50 crc kubenswrapper[4869]: I0218 06:20:50.862858 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-inventory\") pod \"4c96cef9-45b8-4639-a368-063acac72c83\" (UID: \"4c96cef9-45b8-4639-a368-063acac72c83\") " Feb 18 06:20:50 crc kubenswrapper[4869]: I0218 06:20:50.862912 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5nhc\" (UniqueName: \"kubernetes.io/projected/4c96cef9-45b8-4639-a368-063acac72c83-kube-api-access-f5nhc\") pod \"4c96cef9-45b8-4639-a368-063acac72c83\" (UID: \"4c96cef9-45b8-4639-a368-063acac72c83\") " Feb 18 06:20:50 crc kubenswrapper[4869]: I0218 06:20:50.862970 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-ssh-key-openstack-edpm-ipam\") pod \"4c96cef9-45b8-4639-a368-063acac72c83\" (UID: \"4c96cef9-45b8-4639-a368-063acac72c83\") " Feb 18 06:20:50 crc 
kubenswrapper[4869]: I0218 06:20:50.863018 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-neutron-metadata-combined-ca-bundle\") pod \"4c96cef9-45b8-4639-a368-063acac72c83\" (UID: \"4c96cef9-45b8-4639-a368-063acac72c83\") " Feb 18 06:20:50 crc kubenswrapper[4869]: I0218 06:20:50.869159 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "4c96cef9-45b8-4639-a368-063acac72c83" (UID: "4c96cef9-45b8-4639-a368-063acac72c83"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:20:50 crc kubenswrapper[4869]: I0218 06:20:50.870776 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c96cef9-45b8-4639-a368-063acac72c83-kube-api-access-f5nhc" (OuterVolumeSpecName: "kube-api-access-f5nhc") pod "4c96cef9-45b8-4639-a368-063acac72c83" (UID: "4c96cef9-45b8-4639-a368-063acac72c83"). InnerVolumeSpecName "kube-api-access-f5nhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:20:50 crc kubenswrapper[4869]: I0218 06:20:50.892207 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "4c96cef9-45b8-4639-a368-063acac72c83" (UID: "4c96cef9-45b8-4639-a368-063acac72c83"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:20:50 crc kubenswrapper[4869]: I0218 06:20:50.896825 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "4c96cef9-45b8-4639-a368-063acac72c83" (UID: "4c96cef9-45b8-4639-a368-063acac72c83"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:20:50 crc kubenswrapper[4869]: I0218 06:20:50.897370 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4c96cef9-45b8-4639-a368-063acac72c83" (UID: "4c96cef9-45b8-4639-a368-063acac72c83"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:20:50 crc kubenswrapper[4869]: I0218 06:20:50.915185 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-inventory" (OuterVolumeSpecName: "inventory") pod "4c96cef9-45b8-4639-a368-063acac72c83" (UID: "4c96cef9-45b8-4639-a368-063acac72c83"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:20:50 crc kubenswrapper[4869]: I0218 06:20:50.965472 4869 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:50 crc kubenswrapper[4869]: I0218 06:20:50.965505 4869 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:50 crc kubenswrapper[4869]: I0218 06:20:50.965518 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:50 crc kubenswrapper[4869]: I0218 06:20:50.965528 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5nhc\" (UniqueName: \"kubernetes.io/projected/4c96cef9-45b8-4639-a368-063acac72c83-kube-api-access-f5nhc\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:50 crc kubenswrapper[4869]: I0218 06:20:50.965540 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:50 crc kubenswrapper[4869]: I0218 06:20:50.965549 4869 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c96cef9-45b8-4639-a368-063acac72c83-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.355255 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw" event={"ID":"4c96cef9-45b8-4639-a368-063acac72c83","Type":"ContainerDied","Data":"b26c73748d5d815bceacf55dcae48970960fa8bf0557ac8d3aaa36863665a5bd"} Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.355298 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b26c73748d5d815bceacf55dcae48970960fa8bf0557ac8d3aaa36863665a5bd" Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.355968 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw" Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.460343 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w"] Feb 18 06:20:51 crc kubenswrapper[4869]: E0218 06:20:51.461196 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c96cef9-45b8-4639-a368-063acac72c83" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.461276 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c96cef9-45b8-4639-a368-063acac72c83" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.461684 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c96cef9-45b8-4639-a368-063acac72c83" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.462794 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w" Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.465731 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.465912 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.466144 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5vjl5" Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.466559 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.466693 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.522616 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w"] Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.580485 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162af8f6-3123-4d8b-a602-0b2808cd6654-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w\" (UID: \"162af8f6-3123-4d8b-a602-0b2808cd6654\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w" Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.580575 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2q4d\" (UniqueName: \"kubernetes.io/projected/162af8f6-3123-4d8b-a602-0b2808cd6654-kube-api-access-j2q4d\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w\" (UID: 
\"162af8f6-3123-4d8b-a602-0b2808cd6654\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w" Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.580684 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/162af8f6-3123-4d8b-a602-0b2808cd6654-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w\" (UID: \"162af8f6-3123-4d8b-a602-0b2808cd6654\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w" Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.580840 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/162af8f6-3123-4d8b-a602-0b2808cd6654-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w\" (UID: \"162af8f6-3123-4d8b-a602-0b2808cd6654\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w" Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.580879 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/162af8f6-3123-4d8b-a602-0b2808cd6654-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w\" (UID: \"162af8f6-3123-4d8b-a602-0b2808cd6654\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w" Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.683186 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162af8f6-3123-4d8b-a602-0b2808cd6654-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w\" (UID: \"162af8f6-3123-4d8b-a602-0b2808cd6654\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w" Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.683860 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2q4d\" (UniqueName: \"kubernetes.io/projected/162af8f6-3123-4d8b-a602-0b2808cd6654-kube-api-access-j2q4d\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w\" (UID: \"162af8f6-3123-4d8b-a602-0b2808cd6654\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w" Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.684377 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/162af8f6-3123-4d8b-a602-0b2808cd6654-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w\" (UID: \"162af8f6-3123-4d8b-a602-0b2808cd6654\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w" Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.686022 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/162af8f6-3123-4d8b-a602-0b2808cd6654-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w\" (UID: \"162af8f6-3123-4d8b-a602-0b2808cd6654\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w" Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.687335 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/162af8f6-3123-4d8b-a602-0b2808cd6654-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w\" (UID: \"162af8f6-3123-4d8b-a602-0b2808cd6654\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w" Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.689820 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/162af8f6-3123-4d8b-a602-0b2808cd6654-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w\" (UID: 
\"162af8f6-3123-4d8b-a602-0b2808cd6654\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w" Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.692794 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162af8f6-3123-4d8b-a602-0b2808cd6654-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w\" (UID: \"162af8f6-3123-4d8b-a602-0b2808cd6654\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w" Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.697147 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/162af8f6-3123-4d8b-a602-0b2808cd6654-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w\" (UID: \"162af8f6-3123-4d8b-a602-0b2808cd6654\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w" Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.697700 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/162af8f6-3123-4d8b-a602-0b2808cd6654-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w\" (UID: \"162af8f6-3123-4d8b-a602-0b2808cd6654\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w" Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.704733 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2q4d\" (UniqueName: \"kubernetes.io/projected/162af8f6-3123-4d8b-a602-0b2808cd6654-kube-api-access-j2q4d\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w\" (UID: \"162af8f6-3123-4d8b-a602-0b2808cd6654\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w" Feb 18 06:20:51 crc kubenswrapper[4869]: I0218 06:20:51.789278 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w" Feb 18 06:20:52 crc kubenswrapper[4869]: I0218 06:20:52.291841 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w"] Feb 18 06:20:52 crc kubenswrapper[4869]: W0218 06:20:52.294647 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod162af8f6_3123_4d8b_a602_0b2808cd6654.slice/crio-d8cd26a49d0848ac64cda6c5e3c6703de489006eb20d04fc5fc4cddf9863fc27 WatchSource:0}: Error finding container d8cd26a49d0848ac64cda6c5e3c6703de489006eb20d04fc5fc4cddf9863fc27: Status 404 returned error can't find the container with id d8cd26a49d0848ac64cda6c5e3c6703de489006eb20d04fc5fc4cddf9863fc27 Feb 18 06:20:52 crc kubenswrapper[4869]: I0218 06:20:52.367646 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w" event={"ID":"162af8f6-3123-4d8b-a602-0b2808cd6654","Type":"ContainerStarted","Data":"d8cd26a49d0848ac64cda6c5e3c6703de489006eb20d04fc5fc4cddf9863fc27"} Feb 18 06:20:53 crc kubenswrapper[4869]: I0218 06:20:53.380025 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w" event={"ID":"162af8f6-3123-4d8b-a602-0b2808cd6654","Type":"ContainerStarted","Data":"970df7d1b503eefa1e02f9bee671fa59d924a2a70abe93e02d5fe78f9c054567"} Feb 18 06:20:53 crc kubenswrapper[4869]: I0218 06:20:53.404591 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w" podStartSLOduration=1.9897014290000001 podStartE2EDuration="2.404574271s" podCreationTimestamp="2026-02-18 06:20:51 +0000 UTC" firstStartedPulling="2026-02-18 06:20:52.298157236 +0000 UTC m=+1949.467245498" lastFinishedPulling="2026-02-18 06:20:52.713030108 +0000 UTC m=+1949.882118340" 
observedRunningTime="2026-02-18 06:20:53.399152899 +0000 UTC m=+1950.568241141" watchObservedRunningTime="2026-02-18 06:20:53.404574271 +0000 UTC m=+1950.573662503" Feb 18 06:20:57 crc kubenswrapper[4869]: I0218 06:20:57.694664 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kdp9c" Feb 18 06:20:57 crc kubenswrapper[4869]: I0218 06:20:57.741401 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kdp9c" Feb 18 06:20:57 crc kubenswrapper[4869]: I0218 06:20:57.932467 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kdp9c"] Feb 18 06:20:59 crc kubenswrapper[4869]: I0218 06:20:59.425668 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kdp9c" podUID="f5615c2e-3446-4dc9-8323-c2d74199c085" containerName="registry-server" containerID="cri-o://8356e8511aa8a6c7f4fcc1517b7f3019a6d8a2f01e3f5b6b6472e0fb97032887" gracePeriod=2 Feb 18 06:20:59 crc kubenswrapper[4869]: I0218 06:20:59.845624 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kdp9c" Feb 18 06:20:59 crc kubenswrapper[4869]: I0218 06:20:59.947504 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5615c2e-3446-4dc9-8323-c2d74199c085-utilities\") pod \"f5615c2e-3446-4dc9-8323-c2d74199c085\" (UID: \"f5615c2e-3446-4dc9-8323-c2d74199c085\") " Feb 18 06:20:59 crc kubenswrapper[4869]: I0218 06:20:59.947573 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5615c2e-3446-4dc9-8323-c2d74199c085-catalog-content\") pod \"f5615c2e-3446-4dc9-8323-c2d74199c085\" (UID: \"f5615c2e-3446-4dc9-8323-c2d74199c085\") " Feb 18 06:20:59 crc kubenswrapper[4869]: I0218 06:20:59.947591 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j47cw\" (UniqueName: \"kubernetes.io/projected/f5615c2e-3446-4dc9-8323-c2d74199c085-kube-api-access-j47cw\") pod \"f5615c2e-3446-4dc9-8323-c2d74199c085\" (UID: \"f5615c2e-3446-4dc9-8323-c2d74199c085\") " Feb 18 06:20:59 crc kubenswrapper[4869]: I0218 06:20:59.948432 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5615c2e-3446-4dc9-8323-c2d74199c085-utilities" (OuterVolumeSpecName: "utilities") pod "f5615c2e-3446-4dc9-8323-c2d74199c085" (UID: "f5615c2e-3446-4dc9-8323-c2d74199c085"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:20:59 crc kubenswrapper[4869]: I0218 06:20:59.954187 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5615c2e-3446-4dc9-8323-c2d74199c085-kube-api-access-j47cw" (OuterVolumeSpecName: "kube-api-access-j47cw") pod "f5615c2e-3446-4dc9-8323-c2d74199c085" (UID: "f5615c2e-3446-4dc9-8323-c2d74199c085"). InnerVolumeSpecName "kube-api-access-j47cw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:21:00 crc kubenswrapper[4869]: I0218 06:21:00.049758 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j47cw\" (UniqueName: \"kubernetes.io/projected/f5615c2e-3446-4dc9-8323-c2d74199c085-kube-api-access-j47cw\") on node \"crc\" DevicePath \"\"" Feb 18 06:21:00 crc kubenswrapper[4869]: I0218 06:21:00.049792 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5615c2e-3446-4dc9-8323-c2d74199c085-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:21:00 crc kubenswrapper[4869]: I0218 06:21:00.054076 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5615c2e-3446-4dc9-8323-c2d74199c085-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f5615c2e-3446-4dc9-8323-c2d74199c085" (UID: "f5615c2e-3446-4dc9-8323-c2d74199c085"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:21:00 crc kubenswrapper[4869]: I0218 06:21:00.151863 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5615c2e-3446-4dc9-8323-c2d74199c085-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:21:00 crc kubenswrapper[4869]: I0218 06:21:00.435331 4869 generic.go:334] "Generic (PLEG): container finished" podID="f5615c2e-3446-4dc9-8323-c2d74199c085" containerID="8356e8511aa8a6c7f4fcc1517b7f3019a6d8a2f01e3f5b6b6472e0fb97032887" exitCode=0 Feb 18 06:21:00 crc kubenswrapper[4869]: I0218 06:21:00.435414 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kdp9c" Feb 18 06:21:00 crc kubenswrapper[4869]: I0218 06:21:00.435406 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kdp9c" event={"ID":"f5615c2e-3446-4dc9-8323-c2d74199c085","Type":"ContainerDied","Data":"8356e8511aa8a6c7f4fcc1517b7f3019a6d8a2f01e3f5b6b6472e0fb97032887"} Feb 18 06:21:00 crc kubenswrapper[4869]: I0218 06:21:00.435525 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kdp9c" event={"ID":"f5615c2e-3446-4dc9-8323-c2d74199c085","Type":"ContainerDied","Data":"ba380f81d0346a2eb522de3f7eea4f58972212fa14ef15a072faae5220576b4e"} Feb 18 06:21:00 crc kubenswrapper[4869]: I0218 06:21:00.435541 4869 scope.go:117] "RemoveContainer" containerID="8356e8511aa8a6c7f4fcc1517b7f3019a6d8a2f01e3f5b6b6472e0fb97032887" Feb 18 06:21:00 crc kubenswrapper[4869]: I0218 06:21:00.453010 4869 scope.go:117] "RemoveContainer" containerID="1bb00552d901ad74d900c90f60022cb77f7e641083e296b016891dae01ab5928" Feb 18 06:21:00 crc kubenswrapper[4869]: I0218 06:21:00.471536 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kdp9c"] Feb 18 06:21:00 crc kubenswrapper[4869]: I0218 06:21:00.479964 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kdp9c"] Feb 18 06:21:00 crc kubenswrapper[4869]: I0218 06:21:00.494275 4869 scope.go:117] "RemoveContainer" containerID="8c6feefaed13ef862cd718ee431a86ee32f134b9a9abb0430d057f4b6ba99a4d" Feb 18 06:21:00 crc kubenswrapper[4869]: I0218 06:21:00.525226 4869 scope.go:117] "RemoveContainer" containerID="8356e8511aa8a6c7f4fcc1517b7f3019a6d8a2f01e3f5b6b6472e0fb97032887" Feb 18 06:21:00 crc kubenswrapper[4869]: E0218 06:21:00.525787 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8356e8511aa8a6c7f4fcc1517b7f3019a6d8a2f01e3f5b6b6472e0fb97032887\": container with ID starting with 8356e8511aa8a6c7f4fcc1517b7f3019a6d8a2f01e3f5b6b6472e0fb97032887 not found: ID does not exist" containerID="8356e8511aa8a6c7f4fcc1517b7f3019a6d8a2f01e3f5b6b6472e0fb97032887" Feb 18 06:21:00 crc kubenswrapper[4869]: I0218 06:21:00.525887 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8356e8511aa8a6c7f4fcc1517b7f3019a6d8a2f01e3f5b6b6472e0fb97032887"} err="failed to get container status \"8356e8511aa8a6c7f4fcc1517b7f3019a6d8a2f01e3f5b6b6472e0fb97032887\": rpc error: code = NotFound desc = could not find container \"8356e8511aa8a6c7f4fcc1517b7f3019a6d8a2f01e3f5b6b6472e0fb97032887\": container with ID starting with 8356e8511aa8a6c7f4fcc1517b7f3019a6d8a2f01e3f5b6b6472e0fb97032887 not found: ID does not exist" Feb 18 06:21:00 crc kubenswrapper[4869]: I0218 06:21:00.525965 4869 scope.go:117] "RemoveContainer" containerID="1bb00552d901ad74d900c90f60022cb77f7e641083e296b016891dae01ab5928" Feb 18 06:21:00 crc kubenswrapper[4869]: E0218 06:21:00.526325 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bb00552d901ad74d900c90f60022cb77f7e641083e296b016891dae01ab5928\": container with ID starting with 1bb00552d901ad74d900c90f60022cb77f7e641083e296b016891dae01ab5928 not found: ID does not exist" containerID="1bb00552d901ad74d900c90f60022cb77f7e641083e296b016891dae01ab5928" Feb 18 06:21:00 crc kubenswrapper[4869]: I0218 06:21:00.526368 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bb00552d901ad74d900c90f60022cb77f7e641083e296b016891dae01ab5928"} err="failed to get container status \"1bb00552d901ad74d900c90f60022cb77f7e641083e296b016891dae01ab5928\": rpc error: code = NotFound desc = could not find container \"1bb00552d901ad74d900c90f60022cb77f7e641083e296b016891dae01ab5928\": container with ID 
starting with 1bb00552d901ad74d900c90f60022cb77f7e641083e296b016891dae01ab5928 not found: ID does not exist" Feb 18 06:21:00 crc kubenswrapper[4869]: I0218 06:21:00.526394 4869 scope.go:117] "RemoveContainer" containerID="8c6feefaed13ef862cd718ee431a86ee32f134b9a9abb0430d057f4b6ba99a4d" Feb 18 06:21:00 crc kubenswrapper[4869]: E0218 06:21:00.526680 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c6feefaed13ef862cd718ee431a86ee32f134b9a9abb0430d057f4b6ba99a4d\": container with ID starting with 8c6feefaed13ef862cd718ee431a86ee32f134b9a9abb0430d057f4b6ba99a4d not found: ID does not exist" containerID="8c6feefaed13ef862cd718ee431a86ee32f134b9a9abb0430d057f4b6ba99a4d" Feb 18 06:21:00 crc kubenswrapper[4869]: I0218 06:21:00.526792 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c6feefaed13ef862cd718ee431a86ee32f134b9a9abb0430d057f4b6ba99a4d"} err="failed to get container status \"8c6feefaed13ef862cd718ee431a86ee32f134b9a9abb0430d057f4b6ba99a4d\": rpc error: code = NotFound desc = could not find container \"8c6feefaed13ef862cd718ee431a86ee32f134b9a9abb0430d057f4b6ba99a4d\": container with ID starting with 8c6feefaed13ef862cd718ee431a86ee32f134b9a9abb0430d057f4b6ba99a4d not found: ID does not exist" Feb 18 06:21:01 crc kubenswrapper[4869]: I0218 06:21:01.488566 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5615c2e-3446-4dc9-8323-c2d74199c085" path="/var/lib/kubelet/pods/f5615c2e-3446-4dc9-8323-c2d74199c085/volumes" Feb 18 06:21:10 crc kubenswrapper[4869]: I0218 06:21:10.132946 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:21:10 crc kubenswrapper[4869]: I0218 
06:21:10.134108 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:21:40 crc kubenswrapper[4869]: I0218 06:21:40.132358 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:21:40 crc kubenswrapper[4869]: I0218 06:21:40.132930 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:22:10 crc kubenswrapper[4869]: I0218 06:22:10.485890 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:22:10 crc kubenswrapper[4869]: I0218 06:22:10.486545 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:22:10 crc kubenswrapper[4869]: I0218 06:22:10.486602 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" Feb 18 06:22:10 crc kubenswrapper[4869]: I0218 06:22:10.487555 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"be49bab07a746e2c2d8af3b9281f04db227560a8f1eaeae98cdf0117c41dec9c"} pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 06:22:10 crc kubenswrapper[4869]: I0218 06:22:10.487836 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" containerID="cri-o://be49bab07a746e2c2d8af3b9281f04db227560a8f1eaeae98cdf0117c41dec9c" gracePeriod=600 Feb 18 06:22:12 crc kubenswrapper[4869]: I0218 06:22:12.218702 4869 generic.go:334] "Generic (PLEG): container finished" podID="781aec66-5fc7-4161-a704-cc78830d525d" containerID="be49bab07a746e2c2d8af3b9281f04db227560a8f1eaeae98cdf0117c41dec9c" exitCode=0 Feb 18 06:22:12 crc kubenswrapper[4869]: I0218 06:22:12.219208 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerDied","Data":"be49bab07a746e2c2d8af3b9281f04db227560a8f1eaeae98cdf0117c41dec9c"} Feb 18 06:22:12 crc kubenswrapper[4869]: I0218 06:22:12.219243 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerStarted","Data":"1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad"} Feb 18 06:22:12 crc kubenswrapper[4869]: I0218 06:22:12.219265 4869 scope.go:117] "RemoveContainer" 
containerID="6dbe4c4facb96a0748fed59d72af1a790cec853b6a2317a30ac4ece4ddc133b7" Feb 18 06:23:35 crc kubenswrapper[4869]: I0218 06:23:35.360768 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5prcg"] Feb 18 06:23:35 crc kubenswrapper[4869]: E0218 06:23:35.361695 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5615c2e-3446-4dc9-8323-c2d74199c085" containerName="extract-content" Feb 18 06:23:35 crc kubenswrapper[4869]: I0218 06:23:35.361712 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5615c2e-3446-4dc9-8323-c2d74199c085" containerName="extract-content" Feb 18 06:23:35 crc kubenswrapper[4869]: E0218 06:23:35.361774 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5615c2e-3446-4dc9-8323-c2d74199c085" containerName="extract-utilities" Feb 18 06:23:35 crc kubenswrapper[4869]: I0218 06:23:35.361783 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5615c2e-3446-4dc9-8323-c2d74199c085" containerName="extract-utilities" Feb 18 06:23:35 crc kubenswrapper[4869]: E0218 06:23:35.361801 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5615c2e-3446-4dc9-8323-c2d74199c085" containerName="registry-server" Feb 18 06:23:35 crc kubenswrapper[4869]: I0218 06:23:35.361810 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5615c2e-3446-4dc9-8323-c2d74199c085" containerName="registry-server" Feb 18 06:23:35 crc kubenswrapper[4869]: I0218 06:23:35.362015 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5615c2e-3446-4dc9-8323-c2d74199c085" containerName="registry-server" Feb 18 06:23:35 crc kubenswrapper[4869]: I0218 06:23:35.363848 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5prcg" Feb 18 06:23:35 crc kubenswrapper[4869]: I0218 06:23:35.369873 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5prcg"] Feb 18 06:23:35 crc kubenswrapper[4869]: I0218 06:23:35.419903 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c20f376-3c21-4602-98db-9cd22a0db0ad-catalog-content\") pod \"certified-operators-5prcg\" (UID: \"5c20f376-3c21-4602-98db-9cd22a0db0ad\") " pod="openshift-marketplace/certified-operators-5prcg" Feb 18 06:23:35 crc kubenswrapper[4869]: I0218 06:23:35.419971 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx7gp\" (UniqueName: \"kubernetes.io/projected/5c20f376-3c21-4602-98db-9cd22a0db0ad-kube-api-access-nx7gp\") pod \"certified-operators-5prcg\" (UID: \"5c20f376-3c21-4602-98db-9cd22a0db0ad\") " pod="openshift-marketplace/certified-operators-5prcg" Feb 18 06:23:35 crc kubenswrapper[4869]: I0218 06:23:35.420006 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c20f376-3c21-4602-98db-9cd22a0db0ad-utilities\") pod \"certified-operators-5prcg\" (UID: \"5c20f376-3c21-4602-98db-9cd22a0db0ad\") " pod="openshift-marketplace/certified-operators-5prcg" Feb 18 06:23:35 crc kubenswrapper[4869]: I0218 06:23:35.522307 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c20f376-3c21-4602-98db-9cd22a0db0ad-catalog-content\") pod \"certified-operators-5prcg\" (UID: \"5c20f376-3c21-4602-98db-9cd22a0db0ad\") " pod="openshift-marketplace/certified-operators-5prcg" Feb 18 06:23:35 crc kubenswrapper[4869]: I0218 06:23:35.522392 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nx7gp\" (UniqueName: \"kubernetes.io/projected/5c20f376-3c21-4602-98db-9cd22a0db0ad-kube-api-access-nx7gp\") pod \"certified-operators-5prcg\" (UID: \"5c20f376-3c21-4602-98db-9cd22a0db0ad\") " pod="openshift-marketplace/certified-operators-5prcg" Feb 18 06:23:35 crc kubenswrapper[4869]: I0218 06:23:35.522476 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c20f376-3c21-4602-98db-9cd22a0db0ad-utilities\") pod \"certified-operators-5prcg\" (UID: \"5c20f376-3c21-4602-98db-9cd22a0db0ad\") " pod="openshift-marketplace/certified-operators-5prcg" Feb 18 06:23:35 crc kubenswrapper[4869]: I0218 06:23:35.523332 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c20f376-3c21-4602-98db-9cd22a0db0ad-catalog-content\") pod \"certified-operators-5prcg\" (UID: \"5c20f376-3c21-4602-98db-9cd22a0db0ad\") " pod="openshift-marketplace/certified-operators-5prcg" Feb 18 06:23:35 crc kubenswrapper[4869]: I0218 06:23:35.523477 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c20f376-3c21-4602-98db-9cd22a0db0ad-utilities\") pod \"certified-operators-5prcg\" (UID: \"5c20f376-3c21-4602-98db-9cd22a0db0ad\") " pod="openshift-marketplace/certified-operators-5prcg" Feb 18 06:23:35 crc kubenswrapper[4869]: I0218 06:23:35.554000 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx7gp\" (UniqueName: \"kubernetes.io/projected/5c20f376-3c21-4602-98db-9cd22a0db0ad-kube-api-access-nx7gp\") pod \"certified-operators-5prcg\" (UID: \"5c20f376-3c21-4602-98db-9cd22a0db0ad\") " pod="openshift-marketplace/certified-operators-5prcg" Feb 18 06:23:35 crc kubenswrapper[4869]: I0218 06:23:35.694122 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5prcg" Feb 18 06:23:36 crc kubenswrapper[4869]: I0218 06:23:36.234755 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5prcg"] Feb 18 06:23:37 crc kubenswrapper[4869]: I0218 06:23:37.040214 4869 generic.go:334] "Generic (PLEG): container finished" podID="5c20f376-3c21-4602-98db-9cd22a0db0ad" containerID="ba3f2c6ddf3f97a8972a764f12363b424304a7d6f25ad6b01462fda185a26938" exitCode=0 Feb 18 06:23:37 crc kubenswrapper[4869]: I0218 06:23:37.040312 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5prcg" event={"ID":"5c20f376-3c21-4602-98db-9cd22a0db0ad","Type":"ContainerDied","Data":"ba3f2c6ddf3f97a8972a764f12363b424304a7d6f25ad6b01462fda185a26938"} Feb 18 06:23:37 crc kubenswrapper[4869]: I0218 06:23:37.040933 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5prcg" event={"ID":"5c20f376-3c21-4602-98db-9cd22a0db0ad","Type":"ContainerStarted","Data":"8c6c4d3ad58505c7cba3d41c792e479d25cc7bda4187f5671fae0dfc935eeda6"} Feb 18 06:23:37 crc kubenswrapper[4869]: I0218 06:23:37.042599 4869 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 06:23:38 crc kubenswrapper[4869]: I0218 06:23:38.051603 4869 generic.go:334] "Generic (PLEG): container finished" podID="5c20f376-3c21-4602-98db-9cd22a0db0ad" containerID="e4b74a4f97a96dbc1264c042900f7f10e4f07928c3823ac0f151465764c7ca2f" exitCode=0 Feb 18 06:23:38 crc kubenswrapper[4869]: I0218 06:23:38.051800 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5prcg" event={"ID":"5c20f376-3c21-4602-98db-9cd22a0db0ad","Type":"ContainerDied","Data":"e4b74a4f97a96dbc1264c042900f7f10e4f07928c3823ac0f151465764c7ca2f"} Feb 18 06:23:39 crc kubenswrapper[4869]: I0218 06:23:39.064786 4869 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-5prcg" event={"ID":"5c20f376-3c21-4602-98db-9cd22a0db0ad","Type":"ContainerStarted","Data":"c3a83ed2013778ca4c4472b38aeab9f7363071175ce9c3d48a01c893d375ea82"} Feb 18 06:23:39 crc kubenswrapper[4869]: I0218 06:23:39.089645 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5prcg" podStartSLOduration=2.708014647 podStartE2EDuration="4.089624786s" podCreationTimestamp="2026-02-18 06:23:35 +0000 UTC" firstStartedPulling="2026-02-18 06:23:37.042375013 +0000 UTC m=+2114.211463245" lastFinishedPulling="2026-02-18 06:23:38.423985142 +0000 UTC m=+2115.593073384" observedRunningTime="2026-02-18 06:23:39.086836378 +0000 UTC m=+2116.255924630" watchObservedRunningTime="2026-02-18 06:23:39.089624786 +0000 UTC m=+2116.258713018" Feb 18 06:23:45 crc kubenswrapper[4869]: I0218 06:23:45.694601 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5prcg" Feb 18 06:23:45 crc kubenswrapper[4869]: I0218 06:23:45.695184 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5prcg" Feb 18 06:23:45 crc kubenswrapper[4869]: I0218 06:23:45.746110 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5prcg" Feb 18 06:23:46 crc kubenswrapper[4869]: I0218 06:23:46.187238 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5prcg" Feb 18 06:23:46 crc kubenswrapper[4869]: I0218 06:23:46.239352 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5prcg"] Feb 18 06:23:48 crc kubenswrapper[4869]: I0218 06:23:48.146545 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5prcg" 
podUID="5c20f376-3c21-4602-98db-9cd22a0db0ad" containerName="registry-server" containerID="cri-o://c3a83ed2013778ca4c4472b38aeab9f7363071175ce9c3d48a01c893d375ea82" gracePeriod=2 Feb 18 06:23:49 crc kubenswrapper[4869]: I0218 06:23:49.155807 4869 generic.go:334] "Generic (PLEG): container finished" podID="5c20f376-3c21-4602-98db-9cd22a0db0ad" containerID="c3a83ed2013778ca4c4472b38aeab9f7363071175ce9c3d48a01c893d375ea82" exitCode=0 Feb 18 06:23:49 crc kubenswrapper[4869]: I0218 06:23:49.155885 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5prcg" event={"ID":"5c20f376-3c21-4602-98db-9cd22a0db0ad","Type":"ContainerDied","Data":"c3a83ed2013778ca4c4472b38aeab9f7363071175ce9c3d48a01c893d375ea82"} Feb 18 06:23:49 crc kubenswrapper[4869]: I0218 06:23:49.759077 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5prcg" Feb 18 06:23:49 crc kubenswrapper[4869]: I0218 06:23:49.833680 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c20f376-3c21-4602-98db-9cd22a0db0ad-catalog-content\") pod \"5c20f376-3c21-4602-98db-9cd22a0db0ad\" (UID: \"5c20f376-3c21-4602-98db-9cd22a0db0ad\") " Feb 18 06:23:49 crc kubenswrapper[4869]: I0218 06:23:49.833730 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c20f376-3c21-4602-98db-9cd22a0db0ad-utilities\") pod \"5c20f376-3c21-4602-98db-9cd22a0db0ad\" (UID: \"5c20f376-3c21-4602-98db-9cd22a0db0ad\") " Feb 18 06:23:49 crc kubenswrapper[4869]: I0218 06:23:49.833897 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nx7gp\" (UniqueName: \"kubernetes.io/projected/5c20f376-3c21-4602-98db-9cd22a0db0ad-kube-api-access-nx7gp\") pod \"5c20f376-3c21-4602-98db-9cd22a0db0ad\" (UID: 
\"5c20f376-3c21-4602-98db-9cd22a0db0ad\") " Feb 18 06:23:49 crc kubenswrapper[4869]: I0218 06:23:49.834886 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c20f376-3c21-4602-98db-9cd22a0db0ad-utilities" (OuterVolumeSpecName: "utilities") pod "5c20f376-3c21-4602-98db-9cd22a0db0ad" (UID: "5c20f376-3c21-4602-98db-9cd22a0db0ad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:23:49 crc kubenswrapper[4869]: I0218 06:23:49.843235 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c20f376-3c21-4602-98db-9cd22a0db0ad-kube-api-access-nx7gp" (OuterVolumeSpecName: "kube-api-access-nx7gp") pod "5c20f376-3c21-4602-98db-9cd22a0db0ad" (UID: "5c20f376-3c21-4602-98db-9cd22a0db0ad"). InnerVolumeSpecName "kube-api-access-nx7gp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:23:49 crc kubenswrapper[4869]: I0218 06:23:49.893211 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c20f376-3c21-4602-98db-9cd22a0db0ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c20f376-3c21-4602-98db-9cd22a0db0ad" (UID: "5c20f376-3c21-4602-98db-9cd22a0db0ad"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:23:49 crc kubenswrapper[4869]: I0218 06:23:49.936544 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nx7gp\" (UniqueName: \"kubernetes.io/projected/5c20f376-3c21-4602-98db-9cd22a0db0ad-kube-api-access-nx7gp\") on node \"crc\" DevicePath \"\"" Feb 18 06:23:49 crc kubenswrapper[4869]: I0218 06:23:49.936579 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c20f376-3c21-4602-98db-9cd22a0db0ad-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:23:49 crc kubenswrapper[4869]: I0218 06:23:49.936589 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c20f376-3c21-4602-98db-9cd22a0db0ad-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:23:50 crc kubenswrapper[4869]: I0218 06:23:50.167043 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5prcg" event={"ID":"5c20f376-3c21-4602-98db-9cd22a0db0ad","Type":"ContainerDied","Data":"8c6c4d3ad58505c7cba3d41c792e479d25cc7bda4187f5671fae0dfc935eeda6"} Feb 18 06:23:50 crc kubenswrapper[4869]: I0218 06:23:50.167101 4869 scope.go:117] "RemoveContainer" containerID="c3a83ed2013778ca4c4472b38aeab9f7363071175ce9c3d48a01c893d375ea82" Feb 18 06:23:50 crc kubenswrapper[4869]: I0218 06:23:50.167127 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5prcg" Feb 18 06:23:50 crc kubenswrapper[4869]: I0218 06:23:50.186841 4869 scope.go:117] "RemoveContainer" containerID="e4b74a4f97a96dbc1264c042900f7f10e4f07928c3823ac0f151465764c7ca2f" Feb 18 06:23:50 crc kubenswrapper[4869]: I0218 06:23:50.202368 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5prcg"] Feb 18 06:23:50 crc kubenswrapper[4869]: I0218 06:23:50.211261 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5prcg"] Feb 18 06:23:50 crc kubenswrapper[4869]: I0218 06:23:50.232635 4869 scope.go:117] "RemoveContainer" containerID="ba3f2c6ddf3f97a8972a764f12363b424304a7d6f25ad6b01462fda185a26938" Feb 18 06:23:51 crc kubenswrapper[4869]: I0218 06:23:51.479836 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c20f376-3c21-4602-98db-9cd22a0db0ad" path="/var/lib/kubelet/pods/5c20f376-3c21-4602-98db-9cd22a0db0ad/volumes" Feb 18 06:24:15 crc kubenswrapper[4869]: I0218 06:24:15.357878 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dfdnd"] Feb 18 06:24:15 crc kubenswrapper[4869]: E0218 06:24:15.358822 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c20f376-3c21-4602-98db-9cd22a0db0ad" containerName="extract-content" Feb 18 06:24:15 crc kubenswrapper[4869]: I0218 06:24:15.358836 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c20f376-3c21-4602-98db-9cd22a0db0ad" containerName="extract-content" Feb 18 06:24:15 crc kubenswrapper[4869]: E0218 06:24:15.358862 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c20f376-3c21-4602-98db-9cd22a0db0ad" containerName="registry-server" Feb 18 06:24:15 crc kubenswrapper[4869]: I0218 06:24:15.358868 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c20f376-3c21-4602-98db-9cd22a0db0ad" containerName="registry-server" 
Feb 18 06:24:15 crc kubenswrapper[4869]: E0218 06:24:15.358887 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c20f376-3c21-4602-98db-9cd22a0db0ad" containerName="extract-utilities" Feb 18 06:24:15 crc kubenswrapper[4869]: I0218 06:24:15.358893 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c20f376-3c21-4602-98db-9cd22a0db0ad" containerName="extract-utilities" Feb 18 06:24:15 crc kubenswrapper[4869]: I0218 06:24:15.359077 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c20f376-3c21-4602-98db-9cd22a0db0ad" containerName="registry-server" Feb 18 06:24:15 crc kubenswrapper[4869]: I0218 06:24:15.360375 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dfdnd" Feb 18 06:24:15 crc kubenswrapper[4869]: I0218 06:24:15.383636 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dfdnd"] Feb 18 06:24:15 crc kubenswrapper[4869]: I0218 06:24:15.445729 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd620fad-d652-4a98-95d7-8470686b4219-catalog-content\") pod \"community-operators-dfdnd\" (UID: \"dd620fad-d652-4a98-95d7-8470686b4219\") " pod="openshift-marketplace/community-operators-dfdnd" Feb 18 06:24:15 crc kubenswrapper[4869]: I0218 06:24:15.446139 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qdbd\" (UniqueName: \"kubernetes.io/projected/dd620fad-d652-4a98-95d7-8470686b4219-kube-api-access-9qdbd\") pod \"community-operators-dfdnd\" (UID: \"dd620fad-d652-4a98-95d7-8470686b4219\") " pod="openshift-marketplace/community-operators-dfdnd" Feb 18 06:24:15 crc kubenswrapper[4869]: I0218 06:24:15.446206 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/dd620fad-d652-4a98-95d7-8470686b4219-utilities\") pod \"community-operators-dfdnd\" (UID: \"dd620fad-d652-4a98-95d7-8470686b4219\") " pod="openshift-marketplace/community-operators-dfdnd" Feb 18 06:24:15 crc kubenswrapper[4869]: I0218 06:24:15.548040 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd620fad-d652-4a98-95d7-8470686b4219-catalog-content\") pod \"community-operators-dfdnd\" (UID: \"dd620fad-d652-4a98-95d7-8470686b4219\") " pod="openshift-marketplace/community-operators-dfdnd" Feb 18 06:24:15 crc kubenswrapper[4869]: I0218 06:24:15.548420 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qdbd\" (UniqueName: \"kubernetes.io/projected/dd620fad-d652-4a98-95d7-8470686b4219-kube-api-access-9qdbd\") pod \"community-operators-dfdnd\" (UID: \"dd620fad-d652-4a98-95d7-8470686b4219\") " pod="openshift-marketplace/community-operators-dfdnd" Feb 18 06:24:15 crc kubenswrapper[4869]: I0218 06:24:15.548606 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd620fad-d652-4a98-95d7-8470686b4219-catalog-content\") pod \"community-operators-dfdnd\" (UID: \"dd620fad-d652-4a98-95d7-8470686b4219\") " pod="openshift-marketplace/community-operators-dfdnd" Feb 18 06:24:15 crc kubenswrapper[4869]: I0218 06:24:15.548607 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd620fad-d652-4a98-95d7-8470686b4219-utilities\") pod \"community-operators-dfdnd\" (UID: \"dd620fad-d652-4a98-95d7-8470686b4219\") " pod="openshift-marketplace/community-operators-dfdnd" Feb 18 06:24:15 crc kubenswrapper[4869]: I0218 06:24:15.549056 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/dd620fad-d652-4a98-95d7-8470686b4219-utilities\") pod \"community-operators-dfdnd\" (UID: \"dd620fad-d652-4a98-95d7-8470686b4219\") " pod="openshift-marketplace/community-operators-dfdnd" Feb 18 06:24:15 crc kubenswrapper[4869]: I0218 06:24:15.567531 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qdbd\" (UniqueName: \"kubernetes.io/projected/dd620fad-d652-4a98-95d7-8470686b4219-kube-api-access-9qdbd\") pod \"community-operators-dfdnd\" (UID: \"dd620fad-d652-4a98-95d7-8470686b4219\") " pod="openshift-marketplace/community-operators-dfdnd" Feb 18 06:24:15 crc kubenswrapper[4869]: I0218 06:24:15.681776 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dfdnd" Feb 18 06:24:16 crc kubenswrapper[4869]: I0218 06:24:16.217138 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dfdnd"] Feb 18 06:24:16 crc kubenswrapper[4869]: I0218 06:24:16.401607 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfdnd" event={"ID":"dd620fad-d652-4a98-95d7-8470686b4219","Type":"ContainerStarted","Data":"cce83d6daddffd70acac6eac4d1ad09e218b0e864313692110901924563871c0"} Feb 18 06:24:17 crc kubenswrapper[4869]: I0218 06:24:17.411654 4869 generic.go:334] "Generic (PLEG): container finished" podID="dd620fad-d652-4a98-95d7-8470686b4219" containerID="6c1ad69b3f7fc5d4809aa1f8dca09fff3b3aa385cf87effdb8ed090844921c4d" exitCode=0 Feb 18 06:24:17 crc kubenswrapper[4869]: I0218 06:24:17.411696 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfdnd" event={"ID":"dd620fad-d652-4a98-95d7-8470686b4219","Type":"ContainerDied","Data":"6c1ad69b3f7fc5d4809aa1f8dca09fff3b3aa385cf87effdb8ed090844921c4d"} Feb 18 06:24:22 crc kubenswrapper[4869]: I0218 06:24:22.459231 4869 generic.go:334] "Generic (PLEG): container 
finished" podID="dd620fad-d652-4a98-95d7-8470686b4219" containerID="497b5de9b2d49ebecdc4e1fdf16c63a361052e1c030f31ee3c63f86608aa6577" exitCode=0 Feb 18 06:24:22 crc kubenswrapper[4869]: I0218 06:24:22.461254 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfdnd" event={"ID":"dd620fad-d652-4a98-95d7-8470686b4219","Type":"ContainerDied","Data":"497b5de9b2d49ebecdc4e1fdf16c63a361052e1c030f31ee3c63f86608aa6577"} Feb 18 06:24:23 crc kubenswrapper[4869]: I0218 06:24:23.481822 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfdnd" event={"ID":"dd620fad-d652-4a98-95d7-8470686b4219","Type":"ContainerStarted","Data":"f4cb66938050961592c3b1d94a0525c80d39bd0d046143ab745df5e1299f309c"} Feb 18 06:24:23 crc kubenswrapper[4869]: I0218 06:24:23.512100 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dfdnd" podStartSLOduration=2.81312514 podStartE2EDuration="8.51207336s" podCreationTimestamp="2026-02-18 06:24:15 +0000 UTC" firstStartedPulling="2026-02-18 06:24:17.414056872 +0000 UTC m=+2154.583145104" lastFinishedPulling="2026-02-18 06:24:23.113005062 +0000 UTC m=+2160.282093324" observedRunningTime="2026-02-18 06:24:23.506633618 +0000 UTC m=+2160.675721860" watchObservedRunningTime="2026-02-18 06:24:23.51207336 +0000 UTC m=+2160.681161592" Feb 18 06:24:25 crc kubenswrapper[4869]: I0218 06:24:25.682496 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dfdnd" Feb 18 06:24:25 crc kubenswrapper[4869]: I0218 06:24:25.682888 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dfdnd" Feb 18 06:24:26 crc kubenswrapper[4869]: I0218 06:24:26.730626 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-dfdnd" 
podUID="dd620fad-d652-4a98-95d7-8470686b4219" containerName="registry-server" probeResult="failure" output=< Feb 18 06:24:26 crc kubenswrapper[4869]: timeout: failed to connect service ":50051" within 1s Feb 18 06:24:26 crc kubenswrapper[4869]: > Feb 18 06:24:35 crc kubenswrapper[4869]: I0218 06:24:35.749104 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dfdnd" Feb 18 06:24:35 crc kubenswrapper[4869]: I0218 06:24:35.810716 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dfdnd" Feb 18 06:24:35 crc kubenswrapper[4869]: I0218 06:24:35.890441 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dfdnd"] Feb 18 06:24:35 crc kubenswrapper[4869]: I0218 06:24:35.988611 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7px69"] Feb 18 06:24:35 crc kubenswrapper[4869]: I0218 06:24:35.988880 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7px69" podUID="26874db3-d22a-4b27-8691-df2a106444f4" containerName="registry-server" containerID="cri-o://7ad80f31a81ac9227e2d0c708d33ba3fd705111ef2faa6aea6a9bebce48e9719" gracePeriod=2 Feb 18 06:24:36 crc kubenswrapper[4869]: I0218 06:24:36.538935 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7px69" Feb 18 06:24:36 crc kubenswrapper[4869]: I0218 06:24:36.631345 4869 generic.go:334] "Generic (PLEG): container finished" podID="26874db3-d22a-4b27-8691-df2a106444f4" containerID="7ad80f31a81ac9227e2d0c708d33ba3fd705111ef2faa6aea6a9bebce48e9719" exitCode=0 Feb 18 06:24:36 crc kubenswrapper[4869]: I0218 06:24:36.631386 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7px69" event={"ID":"26874db3-d22a-4b27-8691-df2a106444f4","Type":"ContainerDied","Data":"7ad80f31a81ac9227e2d0c708d33ba3fd705111ef2faa6aea6a9bebce48e9719"} Feb 18 06:24:36 crc kubenswrapper[4869]: I0218 06:24:36.632822 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7px69" event={"ID":"26874db3-d22a-4b27-8691-df2a106444f4","Type":"ContainerDied","Data":"b7249fd5d31da43f885ab11aa72a409b634ebff1fcadd7ab8f94660091ddd117"} Feb 18 06:24:36 crc kubenswrapper[4869]: I0218 06:24:36.632848 4869 scope.go:117] "RemoveContainer" containerID="7ad80f31a81ac9227e2d0c708d33ba3fd705111ef2faa6aea6a9bebce48e9719" Feb 18 06:24:36 crc kubenswrapper[4869]: I0218 06:24:36.631405 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7px69" Feb 18 06:24:36 crc kubenswrapper[4869]: I0218 06:24:36.651675 4869 scope.go:117] "RemoveContainer" containerID="f24f06bc81f446a148df6ee0d89b5591245b32843300eafeed6d64608edb2b67" Feb 18 06:24:36 crc kubenswrapper[4869]: I0218 06:24:36.672409 4869 scope.go:117] "RemoveContainer" containerID="015cd94f74f494b4a72736ed5e9a1cec0876345c5f984de38c243ea8e9808c3c" Feb 18 06:24:36 crc kubenswrapper[4869]: I0218 06:24:36.705964 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8r7s\" (UniqueName: \"kubernetes.io/projected/26874db3-d22a-4b27-8691-df2a106444f4-kube-api-access-p8r7s\") pod \"26874db3-d22a-4b27-8691-df2a106444f4\" (UID: \"26874db3-d22a-4b27-8691-df2a106444f4\") " Feb 18 06:24:36 crc kubenswrapper[4869]: I0218 06:24:36.706253 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26874db3-d22a-4b27-8691-df2a106444f4-utilities\") pod \"26874db3-d22a-4b27-8691-df2a106444f4\" (UID: \"26874db3-d22a-4b27-8691-df2a106444f4\") " Feb 18 06:24:36 crc kubenswrapper[4869]: I0218 06:24:36.706278 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26874db3-d22a-4b27-8691-df2a106444f4-catalog-content\") pod \"26874db3-d22a-4b27-8691-df2a106444f4\" (UID: \"26874db3-d22a-4b27-8691-df2a106444f4\") " Feb 18 06:24:36 crc kubenswrapper[4869]: I0218 06:24:36.707015 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26874db3-d22a-4b27-8691-df2a106444f4-utilities" (OuterVolumeSpecName: "utilities") pod "26874db3-d22a-4b27-8691-df2a106444f4" (UID: "26874db3-d22a-4b27-8691-df2a106444f4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:24:36 crc kubenswrapper[4869]: I0218 06:24:36.715145 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26874db3-d22a-4b27-8691-df2a106444f4-kube-api-access-p8r7s" (OuterVolumeSpecName: "kube-api-access-p8r7s") pod "26874db3-d22a-4b27-8691-df2a106444f4" (UID: "26874db3-d22a-4b27-8691-df2a106444f4"). InnerVolumeSpecName "kube-api-access-p8r7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:24:36 crc kubenswrapper[4869]: I0218 06:24:36.718149 4869 scope.go:117] "RemoveContainer" containerID="7ad80f31a81ac9227e2d0c708d33ba3fd705111ef2faa6aea6a9bebce48e9719" Feb 18 06:24:36 crc kubenswrapper[4869]: E0218 06:24:36.718609 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ad80f31a81ac9227e2d0c708d33ba3fd705111ef2faa6aea6a9bebce48e9719\": container with ID starting with 7ad80f31a81ac9227e2d0c708d33ba3fd705111ef2faa6aea6a9bebce48e9719 not found: ID does not exist" containerID="7ad80f31a81ac9227e2d0c708d33ba3fd705111ef2faa6aea6a9bebce48e9719" Feb 18 06:24:36 crc kubenswrapper[4869]: I0218 06:24:36.718675 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ad80f31a81ac9227e2d0c708d33ba3fd705111ef2faa6aea6a9bebce48e9719"} err="failed to get container status \"7ad80f31a81ac9227e2d0c708d33ba3fd705111ef2faa6aea6a9bebce48e9719\": rpc error: code = NotFound desc = could not find container \"7ad80f31a81ac9227e2d0c708d33ba3fd705111ef2faa6aea6a9bebce48e9719\": container with ID starting with 7ad80f31a81ac9227e2d0c708d33ba3fd705111ef2faa6aea6a9bebce48e9719 not found: ID does not exist" Feb 18 06:24:36 crc kubenswrapper[4869]: I0218 06:24:36.718700 4869 scope.go:117] "RemoveContainer" containerID="f24f06bc81f446a148df6ee0d89b5591245b32843300eafeed6d64608edb2b67" Feb 18 06:24:36 crc kubenswrapper[4869]: E0218 06:24:36.719204 
4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f24f06bc81f446a148df6ee0d89b5591245b32843300eafeed6d64608edb2b67\": container with ID starting with f24f06bc81f446a148df6ee0d89b5591245b32843300eafeed6d64608edb2b67 not found: ID does not exist" containerID="f24f06bc81f446a148df6ee0d89b5591245b32843300eafeed6d64608edb2b67" Feb 18 06:24:36 crc kubenswrapper[4869]: I0218 06:24:36.719257 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f24f06bc81f446a148df6ee0d89b5591245b32843300eafeed6d64608edb2b67"} err="failed to get container status \"f24f06bc81f446a148df6ee0d89b5591245b32843300eafeed6d64608edb2b67\": rpc error: code = NotFound desc = could not find container \"f24f06bc81f446a148df6ee0d89b5591245b32843300eafeed6d64608edb2b67\": container with ID starting with f24f06bc81f446a148df6ee0d89b5591245b32843300eafeed6d64608edb2b67 not found: ID does not exist" Feb 18 06:24:36 crc kubenswrapper[4869]: I0218 06:24:36.719287 4869 scope.go:117] "RemoveContainer" containerID="015cd94f74f494b4a72736ed5e9a1cec0876345c5f984de38c243ea8e9808c3c" Feb 18 06:24:36 crc kubenswrapper[4869]: E0218 06:24:36.719609 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"015cd94f74f494b4a72736ed5e9a1cec0876345c5f984de38c243ea8e9808c3c\": container with ID starting with 015cd94f74f494b4a72736ed5e9a1cec0876345c5f984de38c243ea8e9808c3c not found: ID does not exist" containerID="015cd94f74f494b4a72736ed5e9a1cec0876345c5f984de38c243ea8e9808c3c" Feb 18 06:24:36 crc kubenswrapper[4869]: I0218 06:24:36.719635 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"015cd94f74f494b4a72736ed5e9a1cec0876345c5f984de38c243ea8e9808c3c"} err="failed to get container status \"015cd94f74f494b4a72736ed5e9a1cec0876345c5f984de38c243ea8e9808c3c\": rpc error: code = 
NotFound desc = could not find container \"015cd94f74f494b4a72736ed5e9a1cec0876345c5f984de38c243ea8e9808c3c\": container with ID starting with 015cd94f74f494b4a72736ed5e9a1cec0876345c5f984de38c243ea8e9808c3c not found: ID does not exist" Feb 18 06:24:36 crc kubenswrapper[4869]: I0218 06:24:36.763756 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26874db3-d22a-4b27-8691-df2a106444f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26874db3-d22a-4b27-8691-df2a106444f4" (UID: "26874db3-d22a-4b27-8691-df2a106444f4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:24:36 crc kubenswrapper[4869]: I0218 06:24:36.809499 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26874db3-d22a-4b27-8691-df2a106444f4-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:24:36 crc kubenswrapper[4869]: I0218 06:24:36.809551 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26874db3-d22a-4b27-8691-df2a106444f4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:24:36 crc kubenswrapper[4869]: I0218 06:24:36.809566 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8r7s\" (UniqueName: \"kubernetes.io/projected/26874db3-d22a-4b27-8691-df2a106444f4-kube-api-access-p8r7s\") on node \"crc\" DevicePath \"\"" Feb 18 06:24:36 crc kubenswrapper[4869]: I0218 06:24:36.968078 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7px69"] Feb 18 06:24:36 crc kubenswrapper[4869]: I0218 06:24:36.975527 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7px69"] Feb 18 06:24:37 crc kubenswrapper[4869]: I0218 06:24:37.479552 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="26874db3-d22a-4b27-8691-df2a106444f4" path="/var/lib/kubelet/pods/26874db3-d22a-4b27-8691-df2a106444f4/volumes" Feb 18 06:24:40 crc kubenswrapper[4869]: I0218 06:24:40.132384 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:24:40 crc kubenswrapper[4869]: I0218 06:24:40.132848 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:24:41 crc kubenswrapper[4869]: I0218 06:24:41.693522 4869 generic.go:334] "Generic (PLEG): container finished" podID="162af8f6-3123-4d8b-a602-0b2808cd6654" containerID="970df7d1b503eefa1e02f9bee671fa59d924a2a70abe93e02d5fe78f9c054567" exitCode=0 Feb 18 06:24:41 crc kubenswrapper[4869]: I0218 06:24:41.693681 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w" event={"ID":"162af8f6-3123-4d8b-a602-0b2808cd6654","Type":"ContainerDied","Data":"970df7d1b503eefa1e02f9bee671fa59d924a2a70abe93e02d5fe78f9c054567"} Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.097990 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.241907 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162af8f6-3123-4d8b-a602-0b2808cd6654-libvirt-combined-ca-bundle\") pod \"162af8f6-3123-4d8b-a602-0b2808cd6654\" (UID: \"162af8f6-3123-4d8b-a602-0b2808cd6654\") " Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.242066 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/162af8f6-3123-4d8b-a602-0b2808cd6654-inventory\") pod \"162af8f6-3123-4d8b-a602-0b2808cd6654\" (UID: \"162af8f6-3123-4d8b-a602-0b2808cd6654\") " Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.242137 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2q4d\" (UniqueName: \"kubernetes.io/projected/162af8f6-3123-4d8b-a602-0b2808cd6654-kube-api-access-j2q4d\") pod \"162af8f6-3123-4d8b-a602-0b2808cd6654\" (UID: \"162af8f6-3123-4d8b-a602-0b2808cd6654\") " Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.242369 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/162af8f6-3123-4d8b-a602-0b2808cd6654-libvirt-secret-0\") pod \"162af8f6-3123-4d8b-a602-0b2808cd6654\" (UID: \"162af8f6-3123-4d8b-a602-0b2808cd6654\") " Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.242489 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/162af8f6-3123-4d8b-a602-0b2808cd6654-ssh-key-openstack-edpm-ipam\") pod \"162af8f6-3123-4d8b-a602-0b2808cd6654\" (UID: \"162af8f6-3123-4d8b-a602-0b2808cd6654\") " Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.249218 4869 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/162af8f6-3123-4d8b-a602-0b2808cd6654-kube-api-access-j2q4d" (OuterVolumeSpecName: "kube-api-access-j2q4d") pod "162af8f6-3123-4d8b-a602-0b2808cd6654" (UID: "162af8f6-3123-4d8b-a602-0b2808cd6654"). InnerVolumeSpecName "kube-api-access-j2q4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.249889 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/162af8f6-3123-4d8b-a602-0b2808cd6654-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "162af8f6-3123-4d8b-a602-0b2808cd6654" (UID: "162af8f6-3123-4d8b-a602-0b2808cd6654"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.273135 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/162af8f6-3123-4d8b-a602-0b2808cd6654-inventory" (OuterVolumeSpecName: "inventory") pod "162af8f6-3123-4d8b-a602-0b2808cd6654" (UID: "162af8f6-3123-4d8b-a602-0b2808cd6654"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.273963 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/162af8f6-3123-4d8b-a602-0b2808cd6654-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "162af8f6-3123-4d8b-a602-0b2808cd6654" (UID: "162af8f6-3123-4d8b-a602-0b2808cd6654"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.300592 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/162af8f6-3123-4d8b-a602-0b2808cd6654-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "162af8f6-3123-4d8b-a602-0b2808cd6654" (UID: "162af8f6-3123-4d8b-a602-0b2808cd6654"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.345816 4869 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/162af8f6-3123-4d8b-a602-0b2808cd6654-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.345854 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/162af8f6-3123-4d8b-a602-0b2808cd6654-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.345870 4869 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162af8f6-3123-4d8b-a602-0b2808cd6654-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.345881 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/162af8f6-3123-4d8b-a602-0b2808cd6654-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.345890 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2q4d\" (UniqueName: \"kubernetes.io/projected/162af8f6-3123-4d8b-a602-0b2808cd6654-kube-api-access-j2q4d\") on node \"crc\" DevicePath \"\"" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.727776 4869 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w" event={"ID":"162af8f6-3123-4d8b-a602-0b2808cd6654","Type":"ContainerDied","Data":"d8cd26a49d0848ac64cda6c5e3c6703de489006eb20d04fc5fc4cddf9863fc27"} Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.728658 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8cd26a49d0848ac64cda6c5e3c6703de489006eb20d04fc5fc4cddf9863fc27" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.728690 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.838072 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4"] Feb 18 06:24:43 crc kubenswrapper[4869]: E0218 06:24:43.838668 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26874db3-d22a-4b27-8691-df2a106444f4" containerName="registry-server" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.838691 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="26874db3-d22a-4b27-8691-df2a106444f4" containerName="registry-server" Feb 18 06:24:43 crc kubenswrapper[4869]: E0218 06:24:43.838712 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26874db3-d22a-4b27-8691-df2a106444f4" containerName="extract-content" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.838719 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="26874db3-d22a-4b27-8691-df2a106444f4" containerName="extract-content" Feb 18 06:24:43 crc kubenswrapper[4869]: E0218 06:24:43.838779 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="162af8f6-3123-4d8b-a602-0b2808cd6654" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.838790 4869 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="162af8f6-3123-4d8b-a602-0b2808cd6654" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 18 06:24:43 crc kubenswrapper[4869]: E0218 06:24:43.838806 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26874db3-d22a-4b27-8691-df2a106444f4" containerName="extract-utilities" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.838815 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="26874db3-d22a-4b27-8691-df2a106444f4" containerName="extract-utilities" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.839012 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="162af8f6-3123-4d8b-a602-0b2808cd6654" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.839040 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="26874db3-d22a-4b27-8691-df2a106444f4" containerName="registry-server" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.839871 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.842120 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.842395 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.842949 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.843033 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5vjl5" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.844253 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.844566 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.844998 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.853968 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4"] Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.855336 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: 
I0218 06:24:43.855383 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.855446 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.855493 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.855534 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.855557 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.855574 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.855601 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzkmm\" (UniqueName: \"kubernetes.io/projected/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-kube-api-access-vzkmm\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.855623 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.855876 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: 
\"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.855988 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.959456 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.959562 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.959600 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.959635 4869 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.959680 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzkmm\" (UniqueName: \"kubernetes.io/projected/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-kube-api-access-vzkmm\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.959716 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.959814 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.959862 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: 
\"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.959950 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.959982 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.960043 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.963687 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.965474 4869 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.967199 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.967236 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.969036 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.969282 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: 
\"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.969772 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.970395 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.970440 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.972420 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:43 crc kubenswrapper[4869]: I0218 06:24:43.980339 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzkmm\" (UniqueName: 
\"kubernetes.io/projected/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-kube-api-access-vzkmm\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8p8x4\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:44 crc kubenswrapper[4869]: I0218 06:24:44.160905 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:24:44 crc kubenswrapper[4869]: I0218 06:24:44.752060 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4"] Feb 18 06:24:44 crc kubenswrapper[4869]: W0218 06:24:44.757280 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d30bdae_b0f6_49aa_b343_c2b9abc186ba.slice/crio-c9ab92fb075f721c6791fb5fe98df5671df493ce2be68715ccde226dd1368fed WatchSource:0}: Error finding container c9ab92fb075f721c6791fb5fe98df5671df493ce2be68715ccde226dd1368fed: Status 404 returned error can't find the container with id c9ab92fb075f721c6791fb5fe98df5671df493ce2be68715ccde226dd1368fed Feb 18 06:24:45 crc kubenswrapper[4869]: I0218 06:24:45.764273 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" event={"ID":"7d30bdae-b0f6-49aa-b343-c2b9abc186ba","Type":"ContainerStarted","Data":"ac52278627aec5916b0e79a32ca832210f39c6a52db8c922c719b6e77526e09e"} Feb 18 06:24:45 crc kubenswrapper[4869]: I0218 06:24:45.764598 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" event={"ID":"7d30bdae-b0f6-49aa-b343-c2b9abc186ba","Type":"ContainerStarted","Data":"c9ab92fb075f721c6791fb5fe98df5671df493ce2be68715ccde226dd1368fed"} Feb 18 06:24:45 crc kubenswrapper[4869]: I0218 06:24:45.787302 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" podStartSLOduration=2.356784288 podStartE2EDuration="2.78728339s" podCreationTimestamp="2026-02-18 06:24:43 +0000 UTC" firstStartedPulling="2026-02-18 06:24:44.760701551 +0000 UTC m=+2181.929789783" lastFinishedPulling="2026-02-18 06:24:45.191200643 +0000 UTC m=+2182.360288885" observedRunningTime="2026-02-18 06:24:45.779329387 +0000 UTC m=+2182.948417629" watchObservedRunningTime="2026-02-18 06:24:45.78728339 +0000 UTC m=+2182.956371612" Feb 18 06:25:10 crc kubenswrapper[4869]: I0218 06:25:10.133065 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:25:10 crc kubenswrapper[4869]: I0218 06:25:10.133707 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:25:40 crc kubenswrapper[4869]: I0218 06:25:40.132553 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:25:40 crc kubenswrapper[4869]: I0218 06:25:40.133287 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
Feb 18 06:25:40 crc kubenswrapper[4869]: I0218 06:25:40.133362 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" Feb 18 06:25:40 crc kubenswrapper[4869]: I0218 06:25:40.134391 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad"} pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 06:25:40 crc kubenswrapper[4869]: I0218 06:25:40.134494 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" containerID="cri-o://1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad" gracePeriod=600 Feb 18 06:25:40 crc kubenswrapper[4869]: E0218 06:25:40.263917 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:25:40 crc kubenswrapper[4869]: I0218 06:25:40.612477 4869 generic.go:334] "Generic (PLEG): container finished" podID="781aec66-5fc7-4161-a704-cc78830d525d" containerID="1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad" exitCode=0 Feb 18 06:25:40 crc kubenswrapper[4869]: I0218 06:25:40.612557 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" 
event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerDied","Data":"1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad"} Feb 18 06:25:40 crc kubenswrapper[4869]: I0218 06:25:40.612906 4869 scope.go:117] "RemoveContainer" containerID="be49bab07a746e2c2d8af3b9281f04db227560a8f1eaeae98cdf0117c41dec9c" Feb 18 06:25:40 crc kubenswrapper[4869]: I0218 06:25:40.613528 4869 scope.go:117] "RemoveContainer" containerID="1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad" Feb 18 06:25:40 crc kubenswrapper[4869]: E0218 06:25:40.613787 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:25:54 crc kubenswrapper[4869]: I0218 06:25:54.469876 4869 scope.go:117] "RemoveContainer" containerID="1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad" Feb 18 06:25:54 crc kubenswrapper[4869]: E0218 06:25:54.470684 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:26:07 crc kubenswrapper[4869]: I0218 06:26:07.470076 4869 scope.go:117] "RemoveContainer" containerID="1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad" Feb 18 06:26:07 crc kubenswrapper[4869]: E0218 06:26:07.471083 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:26:22 crc kubenswrapper[4869]: I0218 06:26:22.471238 4869 scope.go:117] "RemoveContainer" containerID="1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad" Feb 18 06:26:22 crc kubenswrapper[4869]: E0218 06:26:22.472411 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:26:36 crc kubenswrapper[4869]: I0218 06:26:36.470337 4869 scope.go:117] "RemoveContainer" containerID="1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad" Feb 18 06:26:36 crc kubenswrapper[4869]: E0218 06:26:36.471648 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:26:50 crc kubenswrapper[4869]: I0218 06:26:50.470549 4869 scope.go:117] "RemoveContainer" containerID="1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad" Feb 18 06:26:50 crc kubenswrapper[4869]: E0218 06:26:50.471308 4869 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:27:02 crc kubenswrapper[4869]: I0218 06:27:02.470656 4869 scope.go:117] "RemoveContainer" containerID="1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad" Feb 18 06:27:02 crc kubenswrapper[4869]: E0218 06:27:02.471527 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:27:07 crc kubenswrapper[4869]: I0218 06:27:07.380069 4869 generic.go:334] "Generic (PLEG): container finished" podID="7d30bdae-b0f6-49aa-b343-c2b9abc186ba" containerID="ac52278627aec5916b0e79a32ca832210f39c6a52db8c922c719b6e77526e09e" exitCode=0 Feb 18 06:27:07 crc kubenswrapper[4869]: I0218 06:27:07.380544 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" event={"ID":"7d30bdae-b0f6-49aa-b343-c2b9abc186ba","Type":"ContainerDied","Data":"ac52278627aec5916b0e79a32ca832210f39c6a52db8c922c719b6e77526e09e"} Feb 18 06:27:08 crc kubenswrapper[4869]: I0218 06:27:08.853196 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.004268 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-cell1-compute-config-3\") pod \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.004351 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzkmm\" (UniqueName: \"kubernetes.io/projected/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-kube-api-access-vzkmm\") pod \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.004396 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-cell1-compute-config-2\") pod \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.004442 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-extra-config-0\") pod \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.004465 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-migration-ssh-key-0\") pod \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 
06:27:09.004623 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-combined-ca-bundle\") pod \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.004656 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-ssh-key-openstack-edpm-ipam\") pod \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.004732 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-cell1-compute-config-1\") pod \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.004794 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-inventory\") pod \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.004862 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-migration-ssh-key-1\") pod \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.004918 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-cell1-compute-config-0\") pod \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\" (UID: \"7d30bdae-b0f6-49aa-b343-c2b9abc186ba\") " Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.012162 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "7d30bdae-b0f6-49aa-b343-c2b9abc186ba" (UID: "7d30bdae-b0f6-49aa-b343-c2b9abc186ba"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.017307 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-kube-api-access-vzkmm" (OuterVolumeSpecName: "kube-api-access-vzkmm") pod "7d30bdae-b0f6-49aa-b343-c2b9abc186ba" (UID: "7d30bdae-b0f6-49aa-b343-c2b9abc186ba"). InnerVolumeSpecName "kube-api-access-vzkmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.042574 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "7d30bdae-b0f6-49aa-b343-c2b9abc186ba" (UID: "7d30bdae-b0f6-49aa-b343-c2b9abc186ba"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.045000 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-inventory" (OuterVolumeSpecName: "inventory") pod "7d30bdae-b0f6-49aa-b343-c2b9abc186ba" (UID: "7d30bdae-b0f6-49aa-b343-c2b9abc186ba"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.045049 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "7d30bdae-b0f6-49aa-b343-c2b9abc186ba" (UID: "7d30bdae-b0f6-49aa-b343-c2b9abc186ba"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.047970 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "7d30bdae-b0f6-49aa-b343-c2b9abc186ba" (UID: "7d30bdae-b0f6-49aa-b343-c2b9abc186ba"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.056400 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "7d30bdae-b0f6-49aa-b343-c2b9abc186ba" (UID: "7d30bdae-b0f6-49aa-b343-c2b9abc186ba"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.058588 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "7d30bdae-b0f6-49aa-b343-c2b9abc186ba" (UID: "7d30bdae-b0f6-49aa-b343-c2b9abc186ba"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.079774 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "7d30bdae-b0f6-49aa-b343-c2b9abc186ba" (UID: "7d30bdae-b0f6-49aa-b343-c2b9abc186ba"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.082712 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7d30bdae-b0f6-49aa-b343-c2b9abc186ba" (UID: "7d30bdae-b0f6-49aa-b343-c2b9abc186ba"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.084159 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "7d30bdae-b0f6-49aa-b343-c2b9abc186ba" (UID: "7d30bdae-b0f6-49aa-b343-c2b9abc186ba"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.108552 4869 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.109001 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzkmm\" (UniqueName: \"kubernetes.io/projected/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-kube-api-access-vzkmm\") on node \"crc\" DevicePath \"\"" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.109166 4869 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.109314 4869 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.109506 4869 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.109766 4869 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.109924 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.110049 4869 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.110170 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.110327 4869 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.110504 4869 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7d30bdae-b0f6-49aa-b343-c2b9abc186ba-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.409141 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" event={"ID":"7d30bdae-b0f6-49aa-b343-c2b9abc186ba","Type":"ContainerDied","Data":"c9ab92fb075f721c6791fb5fe98df5671df493ce2be68715ccde226dd1368fed"} Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.409486 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9ab92fb075f721c6791fb5fe98df5671df493ce2be68715ccde226dd1368fed" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.409303 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8p8x4" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.547263 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5"] Feb 18 06:27:09 crc kubenswrapper[4869]: E0218 06:27:09.548121 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d30bdae-b0f6-49aa-b343-c2b9abc186ba" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.548240 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d30bdae-b0f6-49aa-b343-c2b9abc186ba" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.548603 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d30bdae-b0f6-49aa-b343-c2b9abc186ba" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.549800 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.554301 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.554588 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.554713 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.554883 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5vjl5" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.554729 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.574168 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5"] Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.724618 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5\" (UID: \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.724839 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5\" (UID: 
\"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.724927 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5\" (UID: \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.725030 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5\" (UID: \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.725366 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5\" (UID: \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.725461 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5\" (UID: \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.725485 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w545s\" (UniqueName: \"kubernetes.io/projected/bbec6484-4b0d-477a-832a-9fb69ce89f4a-kube-api-access-w545s\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5\" (UID: \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.827549 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5\" (UID: \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.827632 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w545s\" (UniqueName: \"kubernetes.io/projected/bbec6484-4b0d-477a-832a-9fb69ce89f4a-kube-api-access-w545s\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5\" (UID: \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.827786 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5\" (UID: \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.827884 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5\" (UID: \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.827976 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5\" (UID: \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.828025 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5\" (UID: \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.828213 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5\" (UID: \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.833205 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5\" (UID: \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.834481 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5\" (UID: \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.834616 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5\" (UID: \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.835693 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5\" (UID: \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.836084 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5\" (UID: \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.842455 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5\" (UID: \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.860303 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w545s\" (UniqueName: \"kubernetes.io/projected/bbec6484-4b0d-477a-832a-9fb69ce89f4a-kube-api-access-w545s\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5\" (UID: \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" Feb 18 06:27:09 crc kubenswrapper[4869]: I0218 06:27:09.871275 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" Feb 18 06:27:10 crc kubenswrapper[4869]: I0218 06:27:10.216915 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5"] Feb 18 06:27:10 crc kubenswrapper[4869]: I0218 06:27:10.420813 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" event={"ID":"bbec6484-4b0d-477a-832a-9fb69ce89f4a","Type":"ContainerStarted","Data":"15dc42102343c6dad66878229947e67651bed915694d2b33f87e116ee0f3a5ff"} Feb 18 06:27:11 crc kubenswrapper[4869]: I0218 06:27:11.434127 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" event={"ID":"bbec6484-4b0d-477a-832a-9fb69ce89f4a","Type":"ContainerStarted","Data":"f3a4ae0c5be17106dadf0722277c76c942679e6af93604e265718d86136c9911"} Feb 18 06:27:11 crc kubenswrapper[4869]: I0218 06:27:11.466346 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" podStartSLOduration=2.028957731 podStartE2EDuration="2.466321765s" podCreationTimestamp="2026-02-18 06:27:09 +0000 UTC" firstStartedPulling="2026-02-18 06:27:10.219177833 +0000 UTC m=+2327.388266065" lastFinishedPulling="2026-02-18 06:27:10.656541867 +0000 UTC m=+2327.825630099" observedRunningTime="2026-02-18 06:27:11.460014422 +0000 UTC m=+2328.629102694" watchObservedRunningTime="2026-02-18 06:27:11.466321765 +0000 UTC m=+2328.635409987" Feb 18 06:27:13 crc kubenswrapper[4869]: I0218 06:27:13.482679 4869 scope.go:117] "RemoveContainer" containerID="1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad" Feb 18 06:27:13 crc kubenswrapper[4869]: E0218 06:27:13.483877 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:27:24 crc kubenswrapper[4869]: I0218 06:27:24.470382 4869 scope.go:117] "RemoveContainer" containerID="1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad" Feb 18 06:27:24 crc kubenswrapper[4869]: E0218 06:27:24.471208 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:27:37 crc kubenswrapper[4869]: I0218 06:27:37.470145 4869 scope.go:117] "RemoveContainer" containerID="1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad" Feb 18 06:27:37 crc kubenswrapper[4869]: E0218 06:27:37.471108 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:27:50 crc kubenswrapper[4869]: I0218 06:27:50.470155 4869 scope.go:117] "RemoveContainer" containerID="1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad" Feb 18 06:27:50 crc kubenswrapper[4869]: E0218 06:27:50.470833 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:28:01 crc kubenswrapper[4869]: I0218 06:28:01.471107 4869 scope.go:117] "RemoveContainer" containerID="1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad" Feb 18 06:28:01 crc kubenswrapper[4869]: E0218 06:28:01.472212 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:28:16 crc kubenswrapper[4869]: I0218 06:28:16.470416 4869 scope.go:117] "RemoveContainer" containerID="1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad" Feb 18 06:28:16 crc kubenswrapper[4869]: E0218 06:28:16.471334 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:28:28 crc kubenswrapper[4869]: I0218 06:28:28.470237 4869 scope.go:117] "RemoveContainer" containerID="1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad" Feb 18 06:28:28 crc kubenswrapper[4869]: E0218 06:28:28.471084 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:28:43 crc kubenswrapper[4869]: I0218 06:28:43.476936 4869 scope.go:117] "RemoveContainer" containerID="1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad" Feb 18 06:28:43 crc kubenswrapper[4869]: E0218 06:28:43.478236 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:28:54 crc kubenswrapper[4869]: I0218 06:28:54.469830 4869 scope.go:117] "RemoveContainer" containerID="1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad" Feb 18 06:28:54 crc kubenswrapper[4869]: E0218 06:28:54.472164 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:29:06 crc kubenswrapper[4869]: I0218 06:29:06.469827 4869 scope.go:117] "RemoveContainer" containerID="1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad" Feb 18 06:29:06 crc kubenswrapper[4869]: E0218 06:29:06.470641 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:29:19 crc kubenswrapper[4869]: I0218 06:29:19.470052 4869 scope.go:117] "RemoveContainer" containerID="1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad" Feb 18 06:29:19 crc kubenswrapper[4869]: E0218 06:29:19.470884 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:29:34 crc kubenswrapper[4869]: I0218 06:29:34.470001 4869 scope.go:117] "RemoveContainer" containerID="1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad" Feb 18 06:29:34 crc kubenswrapper[4869]: E0218 06:29:34.470828 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:29:37 crc kubenswrapper[4869]: I0218 06:29:37.688554 4869 generic.go:334] "Generic (PLEG): container finished" podID="bbec6484-4b0d-477a-832a-9fb69ce89f4a" containerID="f3a4ae0c5be17106dadf0722277c76c942679e6af93604e265718d86136c9911" exitCode=0 Feb 18 06:29:37 crc kubenswrapper[4869]: 
I0218 06:29:37.688650 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" event={"ID":"bbec6484-4b0d-477a-832a-9fb69ce89f4a","Type":"ContainerDied","Data":"f3a4ae0c5be17106dadf0722277c76c942679e6af93604e265718d86136c9911"} Feb 18 06:29:39 crc kubenswrapper[4869]: I0218 06:29:39.096510 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" Feb 18 06:29:39 crc kubenswrapper[4869]: I0218 06:29:39.253503 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-ceilometer-compute-config-data-2\") pod \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\" (UID: \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " Feb 18 06:29:39 crc kubenswrapper[4869]: I0218 06:29:39.254616 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-inventory\") pod \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\" (UID: \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " Feb 18 06:29:39 crc kubenswrapper[4869]: I0218 06:29:39.254654 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w545s\" (UniqueName: \"kubernetes.io/projected/bbec6484-4b0d-477a-832a-9fb69ce89f4a-kube-api-access-w545s\") pod \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\" (UID: \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " Feb 18 06:29:39 crc kubenswrapper[4869]: I0218 06:29:39.254680 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-ssh-key-openstack-edpm-ipam\") pod \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\" (UID: \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " Feb 18 06:29:39 crc 
kubenswrapper[4869]: I0218 06:29:39.254708 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-ceilometer-compute-config-data-1\") pod \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\" (UID: \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " Feb 18 06:29:39 crc kubenswrapper[4869]: I0218 06:29:39.254899 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-telemetry-combined-ca-bundle\") pod \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\" (UID: \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " Feb 18 06:29:39 crc kubenswrapper[4869]: I0218 06:29:39.254961 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-ceilometer-compute-config-data-0\") pod \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\" (UID: \"bbec6484-4b0d-477a-832a-9fb69ce89f4a\") " Feb 18 06:29:39 crc kubenswrapper[4869]: I0218 06:29:39.261853 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbec6484-4b0d-477a-832a-9fb69ce89f4a-kube-api-access-w545s" (OuterVolumeSpecName: "kube-api-access-w545s") pod "bbec6484-4b0d-477a-832a-9fb69ce89f4a" (UID: "bbec6484-4b0d-477a-832a-9fb69ce89f4a"). InnerVolumeSpecName "kube-api-access-w545s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:29:39 crc kubenswrapper[4869]: I0218 06:29:39.274317 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "bbec6484-4b0d-477a-832a-9fb69ce89f4a" (UID: "bbec6484-4b0d-477a-832a-9fb69ce89f4a"). 
InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:29:39 crc kubenswrapper[4869]: I0218 06:29:39.285352 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bbec6484-4b0d-477a-832a-9fb69ce89f4a" (UID: "bbec6484-4b0d-477a-832a-9fb69ce89f4a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:29:39 crc kubenswrapper[4869]: I0218 06:29:39.285531 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-inventory" (OuterVolumeSpecName: "inventory") pod "bbec6484-4b0d-477a-832a-9fb69ce89f4a" (UID: "bbec6484-4b0d-477a-832a-9fb69ce89f4a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:29:39 crc kubenswrapper[4869]: I0218 06:29:39.286658 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "bbec6484-4b0d-477a-832a-9fb69ce89f4a" (UID: "bbec6484-4b0d-477a-832a-9fb69ce89f4a"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:29:39 crc kubenswrapper[4869]: I0218 06:29:39.289990 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "bbec6484-4b0d-477a-832a-9fb69ce89f4a" (UID: "bbec6484-4b0d-477a-832a-9fb69ce89f4a"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:29:39 crc kubenswrapper[4869]: I0218 06:29:39.305383 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "bbec6484-4b0d-477a-832a-9fb69ce89f4a" (UID: "bbec6484-4b0d-477a-832a-9fb69ce89f4a"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:29:39 crc kubenswrapper[4869]: I0218 06:29:39.356014 4869 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:29:39 crc kubenswrapper[4869]: I0218 06:29:39.356051 4869 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:29:39 crc kubenswrapper[4869]: I0218 06:29:39.356063 4869 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 18 06:29:39 crc kubenswrapper[4869]: I0218 06:29:39.356074 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:29:39 crc kubenswrapper[4869]: I0218 06:29:39.356084 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w545s\" (UniqueName: \"kubernetes.io/projected/bbec6484-4b0d-477a-832a-9fb69ce89f4a-kube-api-access-w545s\") on node \"crc\" DevicePath \"\"" Feb 18 06:29:39 crc 
kubenswrapper[4869]: I0218 06:29:39.356093 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:29:39 crc kubenswrapper[4869]: I0218 06:29:39.356102 4869 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bbec6484-4b0d-477a-832a-9fb69ce89f4a-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 18 06:29:39 crc kubenswrapper[4869]: I0218 06:29:39.705910 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" event={"ID":"bbec6484-4b0d-477a-832a-9fb69ce89f4a","Type":"ContainerDied","Data":"15dc42102343c6dad66878229947e67651bed915694d2b33f87e116ee0f3a5ff"} Feb 18 06:29:39 crc kubenswrapper[4869]: I0218 06:29:39.706226 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15dc42102343c6dad66878229947e67651bed915694d2b33f87e116ee0f3a5ff" Feb 18 06:29:39 crc kubenswrapper[4869]: I0218 06:29:39.705975 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5" Feb 18 06:29:47 crc kubenswrapper[4869]: I0218 06:29:47.471118 4869 scope.go:117] "RemoveContainer" containerID="1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad" Feb 18 06:29:47 crc kubenswrapper[4869]: E0218 06:29:47.471910 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:29:59 crc kubenswrapper[4869]: I0218 06:29:59.471019 4869 scope.go:117] "RemoveContainer" containerID="1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad" Feb 18 06:29:59 crc kubenswrapper[4869]: E0218 06:29:59.472264 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:30:00 crc kubenswrapper[4869]: I0218 06:30:00.152296 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523270-7879r"] Feb 18 06:30:00 crc kubenswrapper[4869]: E0218 06:30:00.152708 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbec6484-4b0d-477a-832a-9fb69ce89f4a" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 18 06:30:00 crc kubenswrapper[4869]: I0218 06:30:00.152725 4869 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bbec6484-4b0d-477a-832a-9fb69ce89f4a" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 18 06:30:00 crc kubenswrapper[4869]: I0218 06:30:00.152927 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbec6484-4b0d-477a-832a-9fb69ce89f4a" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 18 06:30:00 crc kubenswrapper[4869]: I0218 06:30:00.153520 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-7879r" Feb 18 06:30:00 crc kubenswrapper[4869]: I0218 06:30:00.156622 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 18 06:30:00 crc kubenswrapper[4869]: I0218 06:30:00.163383 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 18 06:30:00 crc kubenswrapper[4869]: I0218 06:30:00.169350 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523270-7879r"] Feb 18 06:30:00 crc kubenswrapper[4869]: I0218 06:30:00.250862 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj8z7\" (UniqueName: \"kubernetes.io/projected/f410b838-dda4-4d40-a078-e2817ec8ee7b-kube-api-access-sj8z7\") pod \"collect-profiles-29523270-7879r\" (UID: \"f410b838-dda4-4d40-a078-e2817ec8ee7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-7879r" Feb 18 06:30:00 crc kubenswrapper[4869]: I0218 06:30:00.250928 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f410b838-dda4-4d40-a078-e2817ec8ee7b-secret-volume\") pod \"collect-profiles-29523270-7879r\" (UID: \"f410b838-dda4-4d40-a078-e2817ec8ee7b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-7879r" Feb 18 06:30:00 crc kubenswrapper[4869]: I0218 06:30:00.251401 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f410b838-dda4-4d40-a078-e2817ec8ee7b-config-volume\") pod \"collect-profiles-29523270-7879r\" (UID: \"f410b838-dda4-4d40-a078-e2817ec8ee7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-7879r" Feb 18 06:30:00 crc kubenswrapper[4869]: I0218 06:30:00.353508 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj8z7\" (UniqueName: \"kubernetes.io/projected/f410b838-dda4-4d40-a078-e2817ec8ee7b-kube-api-access-sj8z7\") pod \"collect-profiles-29523270-7879r\" (UID: \"f410b838-dda4-4d40-a078-e2817ec8ee7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-7879r" Feb 18 06:30:00 crc kubenswrapper[4869]: I0218 06:30:00.353829 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f410b838-dda4-4d40-a078-e2817ec8ee7b-secret-volume\") pod \"collect-profiles-29523270-7879r\" (UID: \"f410b838-dda4-4d40-a078-e2817ec8ee7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-7879r" Feb 18 06:30:00 crc kubenswrapper[4869]: I0218 06:30:00.354050 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f410b838-dda4-4d40-a078-e2817ec8ee7b-config-volume\") pod \"collect-profiles-29523270-7879r\" (UID: \"f410b838-dda4-4d40-a078-e2817ec8ee7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-7879r" Feb 18 06:30:00 crc kubenswrapper[4869]: I0218 06:30:00.354906 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/f410b838-dda4-4d40-a078-e2817ec8ee7b-config-volume\") pod \"collect-profiles-29523270-7879r\" (UID: \"f410b838-dda4-4d40-a078-e2817ec8ee7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-7879r" Feb 18 06:30:00 crc kubenswrapper[4869]: I0218 06:30:00.359945 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f410b838-dda4-4d40-a078-e2817ec8ee7b-secret-volume\") pod \"collect-profiles-29523270-7879r\" (UID: \"f410b838-dda4-4d40-a078-e2817ec8ee7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-7879r" Feb 18 06:30:00 crc kubenswrapper[4869]: I0218 06:30:00.369757 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj8z7\" (UniqueName: \"kubernetes.io/projected/f410b838-dda4-4d40-a078-e2817ec8ee7b-kube-api-access-sj8z7\") pod \"collect-profiles-29523270-7879r\" (UID: \"f410b838-dda4-4d40-a078-e2817ec8ee7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-7879r" Feb 18 06:30:00 crc kubenswrapper[4869]: I0218 06:30:00.484519 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-7879r" Feb 18 06:30:00 crc kubenswrapper[4869]: I0218 06:30:00.915439 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523270-7879r"] Feb 18 06:30:01 crc kubenswrapper[4869]: I0218 06:30:01.890779 4869 generic.go:334] "Generic (PLEG): container finished" podID="f410b838-dda4-4d40-a078-e2817ec8ee7b" containerID="fe34658a8ce110a3480586f69ac85da2b15bf5d13c8a16c1f4438ddf19588447" exitCode=0 Feb 18 06:30:01 crc kubenswrapper[4869]: I0218 06:30:01.890861 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-7879r" event={"ID":"f410b838-dda4-4d40-a078-e2817ec8ee7b","Type":"ContainerDied","Data":"fe34658a8ce110a3480586f69ac85da2b15bf5d13c8a16c1f4438ddf19588447"} Feb 18 06:30:01 crc kubenswrapper[4869]: I0218 06:30:01.891071 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-7879r" event={"ID":"f410b838-dda4-4d40-a078-e2817ec8ee7b","Type":"ContainerStarted","Data":"fb73d917bf39e07619ec994ff50200b3bf333b650359bab4c3f9c709499595fb"} Feb 18 06:30:03 crc kubenswrapper[4869]: I0218 06:30:03.188573 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-7879r" Feb 18 06:30:03 crc kubenswrapper[4869]: I0218 06:30:03.315593 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj8z7\" (UniqueName: \"kubernetes.io/projected/f410b838-dda4-4d40-a078-e2817ec8ee7b-kube-api-access-sj8z7\") pod \"f410b838-dda4-4d40-a078-e2817ec8ee7b\" (UID: \"f410b838-dda4-4d40-a078-e2817ec8ee7b\") " Feb 18 06:30:03 crc kubenswrapper[4869]: I0218 06:30:03.315990 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f410b838-dda4-4d40-a078-e2817ec8ee7b-config-volume\") pod \"f410b838-dda4-4d40-a078-e2817ec8ee7b\" (UID: \"f410b838-dda4-4d40-a078-e2817ec8ee7b\") " Feb 18 06:30:03 crc kubenswrapper[4869]: I0218 06:30:03.316152 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f410b838-dda4-4d40-a078-e2817ec8ee7b-secret-volume\") pod \"f410b838-dda4-4d40-a078-e2817ec8ee7b\" (UID: \"f410b838-dda4-4d40-a078-e2817ec8ee7b\") " Feb 18 06:30:03 crc kubenswrapper[4869]: I0218 06:30:03.316688 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f410b838-dda4-4d40-a078-e2817ec8ee7b-config-volume" (OuterVolumeSpecName: "config-volume") pod "f410b838-dda4-4d40-a078-e2817ec8ee7b" (UID: "f410b838-dda4-4d40-a078-e2817ec8ee7b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:30:03 crc kubenswrapper[4869]: I0218 06:30:03.322264 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f410b838-dda4-4d40-a078-e2817ec8ee7b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f410b838-dda4-4d40-a078-e2817ec8ee7b" (UID: "f410b838-dda4-4d40-a078-e2817ec8ee7b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:30:03 crc kubenswrapper[4869]: I0218 06:30:03.322475 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f410b838-dda4-4d40-a078-e2817ec8ee7b-kube-api-access-sj8z7" (OuterVolumeSpecName: "kube-api-access-sj8z7") pod "f410b838-dda4-4d40-a078-e2817ec8ee7b" (UID: "f410b838-dda4-4d40-a078-e2817ec8ee7b"). InnerVolumeSpecName "kube-api-access-sj8z7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:30:03 crc kubenswrapper[4869]: I0218 06:30:03.427346 4869 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f410b838-dda4-4d40-a078-e2817ec8ee7b-config-volume\") on node \"crc\" DevicePath \"\"" Feb 18 06:30:03 crc kubenswrapper[4869]: I0218 06:30:03.427393 4869 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f410b838-dda4-4d40-a078-e2817ec8ee7b-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 18 06:30:03 crc kubenswrapper[4869]: I0218 06:30:03.427406 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj8z7\" (UniqueName: \"kubernetes.io/projected/f410b838-dda4-4d40-a078-e2817ec8ee7b-kube-api-access-sj8z7\") on node \"crc\" DevicePath \"\"" Feb 18 06:30:03 crc kubenswrapper[4869]: I0218 06:30:03.909694 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-7879r" event={"ID":"f410b838-dda4-4d40-a078-e2817ec8ee7b","Type":"ContainerDied","Data":"fb73d917bf39e07619ec994ff50200b3bf333b650359bab4c3f9c709499595fb"} Feb 18 06:30:03 crc kubenswrapper[4869]: I0218 06:30:03.910082 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb73d917bf39e07619ec994ff50200b3bf333b650359bab4c3f9c709499595fb" Feb 18 06:30:03 crc kubenswrapper[4869]: I0218 06:30:03.909880 4869 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-7879r" Feb 18 06:30:04 crc kubenswrapper[4869]: I0218 06:30:04.272141 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523225-j2h97"] Feb 18 06:30:04 crc kubenswrapper[4869]: I0218 06:30:04.284502 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523225-j2h97"] Feb 18 06:30:05 crc kubenswrapper[4869]: I0218 06:30:05.480154 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf" path="/var/lib/kubelet/pods/c12a0321-b9c2-46c5-b1e4-70f3ef43c1bf/volumes" Feb 18 06:30:11 crc kubenswrapper[4869]: I0218 06:30:11.470776 4869 scope.go:117] "RemoveContainer" containerID="1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad" Feb 18 06:30:11 crc kubenswrapper[4869]: E0218 06:30:11.471460 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.543913 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Feb 18 06:30:21 crc kubenswrapper[4869]: E0218 06:30:21.544916 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f410b838-dda4-4d40-a078-e2817ec8ee7b" containerName="collect-profiles" Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.544932 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f410b838-dda4-4d40-a078-e2817ec8ee7b" containerName="collect-profiles" Feb 18 06:30:21 crc 
kubenswrapper[4869]: I0218 06:30:21.545229 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f410b838-dda4-4d40-a078-e2817ec8ee7b" containerName="collect-profiles" Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.546035 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.549492 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.549646 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.549881 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-6ts9p" Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.551819 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.561271 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.629632 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4450687d-212c-4577-9511-05a7f072b274-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " pod="openstack/tempest-tests-tempest" Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.629816 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " 
pod="openstack/tempest-tests-tempest" Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.630411 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4450687d-212c-4577-9511-05a7f072b274-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " pod="openstack/tempest-tests-tempest" Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.630927 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4450687d-212c-4577-9511-05a7f072b274-config-data\") pod \"tempest-tests-tempest\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " pod="openstack/tempest-tests-tempest" Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.631168 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4450687d-212c-4577-9511-05a7f072b274-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " pod="openstack/tempest-tests-tempest" Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.631259 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4450687d-212c-4577-9511-05a7f072b274-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " pod="openstack/tempest-tests-tempest" Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.631445 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl2k5\" (UniqueName: \"kubernetes.io/projected/4450687d-212c-4577-9511-05a7f072b274-kube-api-access-sl2k5\") pod \"tempest-tests-tempest\" (UID: 
\"4450687d-212c-4577-9511-05a7f072b274\") " pod="openstack/tempest-tests-tempest" Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.631531 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4450687d-212c-4577-9511-05a7f072b274-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " pod="openstack/tempest-tests-tempest" Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.631792 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4450687d-212c-4577-9511-05a7f072b274-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " pod="openstack/tempest-tests-tempest" Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.733013 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4450687d-212c-4577-9511-05a7f072b274-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " pod="openstack/tempest-tests-tempest" Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.733066 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4450687d-212c-4577-9511-05a7f072b274-config-data\") pod \"tempest-tests-tempest\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " pod="openstack/tempest-tests-tempest" Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.733130 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4450687d-212c-4577-9511-05a7f072b274-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " pod="openstack/tempest-tests-tempest" Feb 
18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.733159 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4450687d-212c-4577-9511-05a7f072b274-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " pod="openstack/tempest-tests-tempest" Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.733206 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl2k5\" (UniqueName: \"kubernetes.io/projected/4450687d-212c-4577-9511-05a7f072b274-kube-api-access-sl2k5\") pod \"tempest-tests-tempest\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " pod="openstack/tempest-tests-tempest" Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.733235 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4450687d-212c-4577-9511-05a7f072b274-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " pod="openstack/tempest-tests-tempest" Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.733293 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4450687d-212c-4577-9511-05a7f072b274-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " pod="openstack/tempest-tests-tempest" Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.733340 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4450687d-212c-4577-9511-05a7f072b274-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " pod="openstack/tempest-tests-tempest" Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.733384 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.734193 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4450687d-212c-4577-9511-05a7f072b274-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.734433 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4450687d-212c-4577-9511-05a7f072b274-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.734667 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4450687d-212c-4577-9511-05a7f072b274-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.735091 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4450687d-212c-4577-9511-05a7f072b274-config-data\") pod \"tempest-tests-tempest\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.736714 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/tempest-tests-tempest"
Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.740722 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4450687d-212c-4577-9511-05a7f072b274-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.742105 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4450687d-212c-4577-9511-05a7f072b274-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.747377 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4450687d-212c-4577-9511-05a7f072b274-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.754558 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl2k5\" (UniqueName: \"kubernetes.io/projected/4450687d-212c-4577-9511-05a7f072b274-kube-api-access-sl2k5\") pod \"tempest-tests-tempest\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.771408 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:30:21 crc kubenswrapper[4869]: I0218 06:30:21.862683 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Feb 18 06:30:22 crc kubenswrapper[4869]: I0218 06:30:22.339710 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Feb 18 06:30:22 crc kubenswrapper[4869]: I0218 06:30:22.353879 4869 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 18 06:30:23 crc kubenswrapper[4869]: I0218 06:30:23.079462 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4450687d-212c-4577-9511-05a7f072b274","Type":"ContainerStarted","Data":"3c5c686cc0beb881c833427e85ac6704e47c230c3f2cd4cf93a87f6b56acdb15"}
Feb 18 06:30:25 crc kubenswrapper[4869]: I0218 06:30:25.474164 4869 scope.go:117] "RemoveContainer" containerID="1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad"
Feb 18 06:30:25 crc kubenswrapper[4869]: E0218 06:30:25.476458 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d"
Feb 18 06:30:26 crc kubenswrapper[4869]: I0218 06:30:26.327534 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pkrqf"]
Feb 18 06:30:26 crc kubenswrapper[4869]: I0218 06:30:26.330249 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pkrqf"
Feb 18 06:30:26 crc kubenswrapper[4869]: I0218 06:30:26.341139 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pkrqf"]
Feb 18 06:30:26 crc kubenswrapper[4869]: I0218 06:30:26.446086 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s946\" (UniqueName: \"kubernetes.io/projected/f246250a-7946-47e3-96c5-e4c071e022fa-kube-api-access-4s946\") pod \"redhat-marketplace-pkrqf\" (UID: \"f246250a-7946-47e3-96c5-e4c071e022fa\") " pod="openshift-marketplace/redhat-marketplace-pkrqf"
Feb 18 06:30:26 crc kubenswrapper[4869]: I0218 06:30:26.446166 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f246250a-7946-47e3-96c5-e4c071e022fa-catalog-content\") pod \"redhat-marketplace-pkrqf\" (UID: \"f246250a-7946-47e3-96c5-e4c071e022fa\") " pod="openshift-marketplace/redhat-marketplace-pkrqf"
Feb 18 06:30:26 crc kubenswrapper[4869]: I0218 06:30:26.446189 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f246250a-7946-47e3-96c5-e4c071e022fa-utilities\") pod \"redhat-marketplace-pkrqf\" (UID: \"f246250a-7946-47e3-96c5-e4c071e022fa\") " pod="openshift-marketplace/redhat-marketplace-pkrqf"
Feb 18 06:30:26 crc kubenswrapper[4869]: I0218 06:30:26.548198 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s946\" (UniqueName: \"kubernetes.io/projected/f246250a-7946-47e3-96c5-e4c071e022fa-kube-api-access-4s946\") pod \"redhat-marketplace-pkrqf\" (UID: \"f246250a-7946-47e3-96c5-e4c071e022fa\") " pod="openshift-marketplace/redhat-marketplace-pkrqf"
Feb 18 06:30:26 crc kubenswrapper[4869]: I0218 06:30:26.548261 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f246250a-7946-47e3-96c5-e4c071e022fa-catalog-content\") pod \"redhat-marketplace-pkrqf\" (UID: \"f246250a-7946-47e3-96c5-e4c071e022fa\") " pod="openshift-marketplace/redhat-marketplace-pkrqf"
Feb 18 06:30:26 crc kubenswrapper[4869]: I0218 06:30:26.548276 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f246250a-7946-47e3-96c5-e4c071e022fa-utilities\") pod \"redhat-marketplace-pkrqf\" (UID: \"f246250a-7946-47e3-96c5-e4c071e022fa\") " pod="openshift-marketplace/redhat-marketplace-pkrqf"
Feb 18 06:30:26 crc kubenswrapper[4869]: I0218 06:30:26.549239 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f246250a-7946-47e3-96c5-e4c071e022fa-utilities\") pod \"redhat-marketplace-pkrqf\" (UID: \"f246250a-7946-47e3-96c5-e4c071e022fa\") " pod="openshift-marketplace/redhat-marketplace-pkrqf"
Feb 18 06:30:26 crc kubenswrapper[4869]: I0218 06:30:26.549421 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f246250a-7946-47e3-96c5-e4c071e022fa-catalog-content\") pod \"redhat-marketplace-pkrqf\" (UID: \"f246250a-7946-47e3-96c5-e4c071e022fa\") " pod="openshift-marketplace/redhat-marketplace-pkrqf"
Feb 18 06:30:26 crc kubenswrapper[4869]: I0218 06:30:26.570707 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s946\" (UniqueName: \"kubernetes.io/projected/f246250a-7946-47e3-96c5-e4c071e022fa-kube-api-access-4s946\") pod \"redhat-marketplace-pkrqf\" (UID: \"f246250a-7946-47e3-96c5-e4c071e022fa\") " pod="openshift-marketplace/redhat-marketplace-pkrqf"
Feb 18 06:30:26 crc kubenswrapper[4869]: I0218 06:30:26.658486 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pkrqf"
Feb 18 06:30:28 crc kubenswrapper[4869]: I0218 06:30:28.063882 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pkrqf"]
Feb 18 06:30:28 crc kubenswrapper[4869]: I0218 06:30:28.292373 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pkrqf" event={"ID":"f246250a-7946-47e3-96c5-e4c071e022fa","Type":"ContainerStarted","Data":"b75f3ef65cf1443bb842dc4c8917a816572054bff487435b838f639f6fa37e08"}
Feb 18 06:30:28 crc kubenswrapper[4869]: I0218 06:30:28.292634 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pkrqf" event={"ID":"f246250a-7946-47e3-96c5-e4c071e022fa","Type":"ContainerStarted","Data":"ef28abeddf0a42304df79c0a9e477e83bd6ac952df8600d2073c844fd78e65a3"}
Feb 18 06:30:29 crc kubenswrapper[4869]: I0218 06:30:29.302327 4869 generic.go:334] "Generic (PLEG): container finished" podID="f246250a-7946-47e3-96c5-e4c071e022fa" containerID="b75f3ef65cf1443bb842dc4c8917a816572054bff487435b838f639f6fa37e08" exitCode=0
Feb 18 06:30:29 crc kubenswrapper[4869]: I0218 06:30:29.302375 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pkrqf" event={"ID":"f246250a-7946-47e3-96c5-e4c071e022fa","Type":"ContainerDied","Data":"b75f3ef65cf1443bb842dc4c8917a816572054bff487435b838f639f6fa37e08"}
Feb 18 06:30:30 crc kubenswrapper[4869]: I0218 06:30:30.491023 4869 scope.go:117] "RemoveContainer" containerID="e7ba3a4c09402694d3eb14f70c7a6a424db9ddc92e4597ffdcc385fa50203789"
Feb 18 06:30:31 crc kubenswrapper[4869]: I0218 06:30:31.322726 4869 generic.go:334] "Generic (PLEG): container finished" podID="f246250a-7946-47e3-96c5-e4c071e022fa" containerID="7c2b113d45d3fbf49999782b16716bf3b016adc07764d7737a05ec1bb88a04ce" exitCode=0
Feb 18 06:30:31 crc kubenswrapper[4869]: I0218 06:30:31.322876 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pkrqf" event={"ID":"f246250a-7946-47e3-96c5-e4c071e022fa","Type":"ContainerDied","Data":"7c2b113d45d3fbf49999782b16716bf3b016adc07764d7737a05ec1bb88a04ce"}
Feb 18 06:30:36 crc kubenswrapper[4869]: I0218 06:30:36.469875 4869 scope.go:117] "RemoveContainer" containerID="1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad"
Feb 18 06:30:36 crc kubenswrapper[4869]: E0218 06:30:36.470619 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d"
Feb 18 06:30:50 crc kubenswrapper[4869]: I0218 06:30:50.470720 4869 scope.go:117] "RemoveContainer" containerID="1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad"
Feb 18 06:30:51 crc kubenswrapper[4869]: E0218 06:30:51.419129 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified"
Feb 18 06:30:51 crc kubenswrapper[4869]: E0218 06:30:51.419562 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sl2k5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(4450687d-212c-4577-9511-05a7f072b274): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 18 06:30:51 crc kubenswrapper[4869]: E0218 06:30:51.420737 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="4450687d-212c-4577-9511-05a7f072b274"
Feb 18 06:30:51 crc kubenswrapper[4869]: I0218 06:30:51.519275 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pkrqf" event={"ID":"f246250a-7946-47e3-96c5-e4c071e022fa","Type":"ContainerStarted","Data":"b411f5952938b5085bca9b8a20c5150ca33520bdd9fd29b346add98db00e94be"}
Feb 18 06:30:51 crc kubenswrapper[4869]: I0218 06:30:51.526228 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerStarted","Data":"45d6d1d3801d55e94174b9f3903d1e89f908ddba36891afe044a8bac1a4b58b0"}
Feb 18 06:30:51 crc kubenswrapper[4869]: E0218 06:30:51.527990 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="4450687d-212c-4577-9511-05a7f072b274"
Feb 18 06:30:51 crc kubenswrapper[4869]: I0218 06:30:51.548434 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pkrqf" podStartSLOduration=21.036797437 podStartE2EDuration="25.548410905s" podCreationTimestamp="2026-02-18 06:30:26 +0000 UTC" firstStartedPulling="2026-02-18 06:30:29.304152244 +0000 UTC m=+2526.473240476" lastFinishedPulling="2026-02-18 06:30:33.815765692 +0000 UTC m=+2530.984853944" observedRunningTime="2026-02-18 06:30:51.542037642 +0000 UTC m=+2548.711125964" watchObservedRunningTime="2026-02-18 06:30:51.548410905 +0000 UTC m=+2548.717499147"
Feb 18 06:30:56 crc kubenswrapper[4869]: I0218 06:30:56.660156 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pkrqf"
Feb 18 06:30:56 crc kubenswrapper[4869]: I0218 06:30:56.661924 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pkrqf"
Feb 18 06:30:57 crc kubenswrapper[4869]: I0218 06:30:57.708479 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-pkrqf" podUID="f246250a-7946-47e3-96c5-e4c071e022fa" containerName="registry-server" probeResult="failure" output=<
Feb 18 06:30:57 crc kubenswrapper[4869]: timeout: failed to connect service ":50051" within 1s
Feb 18 06:30:57 crc kubenswrapper[4869]: >
Feb 18 06:31:06 crc kubenswrapper[4869]: I0218 06:31:06.735126 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pkrqf"
Feb 18 06:31:06 crc kubenswrapper[4869]: I0218 06:31:06.784834 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pkrqf"
Feb 18 06:31:06 crc kubenswrapper[4869]: I0218 06:31:06.906217 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Feb 18 06:31:06 crc kubenswrapper[4869]: I0218 06:31:06.970960 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pkrqf"]
Feb 18 06:31:08 crc kubenswrapper[4869]: I0218 06:31:08.738235 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pkrqf" podUID="f246250a-7946-47e3-96c5-e4c071e022fa" containerName="registry-server" containerID="cri-o://b411f5952938b5085bca9b8a20c5150ca33520bdd9fd29b346add98db00e94be" gracePeriod=2
Feb 18 06:31:08 crc kubenswrapper[4869]: I0218 06:31:08.739239 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4450687d-212c-4577-9511-05a7f072b274","Type":"ContainerStarted","Data":"f558503deb74af893a9669958490c6a8cbb416c1d93dd1fdfac0ccbc26bb3770"}
Feb 18 06:31:08 crc kubenswrapper[4869]: I0218 06:31:08.763590 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.21350902 podStartE2EDuration="48.763553234s" podCreationTimestamp="2026-02-18 06:30:20 +0000 UTC" firstStartedPulling="2026-02-18 06:30:22.353565967 +0000 UTC m=+2519.522654229" lastFinishedPulling="2026-02-18 06:31:06.903610211 +0000 UTC m=+2564.072698443" observedRunningTime="2026-02-18 06:31:08.75759793 +0000 UTC m=+2565.926686202" watchObservedRunningTime="2026-02-18 06:31:08.763553234 +0000 UTC m=+2565.932641506"
Feb 18 06:31:09 crc kubenswrapper[4869]: I0218 06:31:09.393176 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pkrqf"
Feb 18 06:31:09 crc kubenswrapper[4869]: I0218 06:31:09.578392 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4s946\" (UniqueName: \"kubernetes.io/projected/f246250a-7946-47e3-96c5-e4c071e022fa-kube-api-access-4s946\") pod \"f246250a-7946-47e3-96c5-e4c071e022fa\" (UID: \"f246250a-7946-47e3-96c5-e4c071e022fa\") "
Feb 18 06:31:09 crc kubenswrapper[4869]: I0218 06:31:09.578515 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f246250a-7946-47e3-96c5-e4c071e022fa-catalog-content\") pod \"f246250a-7946-47e3-96c5-e4c071e022fa\" (UID: \"f246250a-7946-47e3-96c5-e4c071e022fa\") "
Feb 18 06:31:09 crc kubenswrapper[4869]: I0218 06:31:09.578704 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f246250a-7946-47e3-96c5-e4c071e022fa-utilities\") pod \"f246250a-7946-47e3-96c5-e4c071e022fa\" (UID: \"f246250a-7946-47e3-96c5-e4c071e022fa\") "
Feb 18 06:31:09 crc kubenswrapper[4869]: I0218 06:31:09.579648 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f246250a-7946-47e3-96c5-e4c071e022fa-utilities" (OuterVolumeSpecName: "utilities") pod "f246250a-7946-47e3-96c5-e4c071e022fa" (UID: "f246250a-7946-47e3-96c5-e4c071e022fa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:31:09 crc kubenswrapper[4869]: I0218 06:31:09.585553 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f246250a-7946-47e3-96c5-e4c071e022fa-kube-api-access-4s946" (OuterVolumeSpecName: "kube-api-access-4s946") pod "f246250a-7946-47e3-96c5-e4c071e022fa" (UID: "f246250a-7946-47e3-96c5-e4c071e022fa"). InnerVolumeSpecName "kube-api-access-4s946". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:31:09 crc kubenswrapper[4869]: I0218 06:31:09.608438 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f246250a-7946-47e3-96c5-e4c071e022fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f246250a-7946-47e3-96c5-e4c071e022fa" (UID: "f246250a-7946-47e3-96c5-e4c071e022fa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:31:09 crc kubenswrapper[4869]: I0218 06:31:09.681318 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f246250a-7946-47e3-96c5-e4c071e022fa-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 06:31:09 crc kubenswrapper[4869]: I0218 06:31:09.681364 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4s946\" (UniqueName: \"kubernetes.io/projected/f246250a-7946-47e3-96c5-e4c071e022fa-kube-api-access-4s946\") on node \"crc\" DevicePath \"\""
Feb 18 06:31:09 crc kubenswrapper[4869]: I0218 06:31:09.681375 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f246250a-7946-47e3-96c5-e4c071e022fa-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 06:31:09 crc kubenswrapper[4869]: I0218 06:31:09.748374 4869 generic.go:334] "Generic (PLEG): container finished" podID="f246250a-7946-47e3-96c5-e4c071e022fa" containerID="b411f5952938b5085bca9b8a20c5150ca33520bdd9fd29b346add98db00e94be" exitCode=0
Feb 18 06:31:09 crc kubenswrapper[4869]: I0218 06:31:09.748419 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pkrqf" event={"ID":"f246250a-7946-47e3-96c5-e4c071e022fa","Type":"ContainerDied","Data":"b411f5952938b5085bca9b8a20c5150ca33520bdd9fd29b346add98db00e94be"}
Feb 18 06:31:09 crc kubenswrapper[4869]: I0218 06:31:09.748443 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pkrqf" event={"ID":"f246250a-7946-47e3-96c5-e4c071e022fa","Type":"ContainerDied","Data":"ef28abeddf0a42304df79c0a9e477e83bd6ac952df8600d2073c844fd78e65a3"}
Feb 18 06:31:09 crc kubenswrapper[4869]: I0218 06:31:09.748462 4869 scope.go:117] "RemoveContainer" containerID="b411f5952938b5085bca9b8a20c5150ca33520bdd9fd29b346add98db00e94be"
Feb 18 06:31:09 crc kubenswrapper[4869]: I0218 06:31:09.748578 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pkrqf"
Feb 18 06:31:09 crc kubenswrapper[4869]: I0218 06:31:09.797955 4869 scope.go:117] "RemoveContainer" containerID="7c2b113d45d3fbf49999782b16716bf3b016adc07764d7737a05ec1bb88a04ce"
Feb 18 06:31:09 crc kubenswrapper[4869]: I0218 06:31:09.799218 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pkrqf"]
Feb 18 06:31:09 crc kubenswrapper[4869]: I0218 06:31:09.808234 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pkrqf"]
Feb 18 06:31:09 crc kubenswrapper[4869]: I0218 06:31:09.814188 4869 scope.go:117] "RemoveContainer" containerID="b75f3ef65cf1443bb842dc4c8917a816572054bff487435b838f639f6fa37e08"
Feb 18 06:31:09 crc kubenswrapper[4869]: I0218 06:31:09.860041 4869 scope.go:117] "RemoveContainer" containerID="b411f5952938b5085bca9b8a20c5150ca33520bdd9fd29b346add98db00e94be"
Feb 18 06:31:09 crc kubenswrapper[4869]: E0218 06:31:09.860704 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b411f5952938b5085bca9b8a20c5150ca33520bdd9fd29b346add98db00e94be\": container with ID starting with b411f5952938b5085bca9b8a20c5150ca33520bdd9fd29b346add98db00e94be not found: ID does not exist" containerID="b411f5952938b5085bca9b8a20c5150ca33520bdd9fd29b346add98db00e94be"
Feb 18 06:31:09 crc kubenswrapper[4869]: I0218 06:31:09.860761 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b411f5952938b5085bca9b8a20c5150ca33520bdd9fd29b346add98db00e94be"} err="failed to get container status \"b411f5952938b5085bca9b8a20c5150ca33520bdd9fd29b346add98db00e94be\": rpc error: code = NotFound desc = could not find container \"b411f5952938b5085bca9b8a20c5150ca33520bdd9fd29b346add98db00e94be\": container with ID starting with b411f5952938b5085bca9b8a20c5150ca33520bdd9fd29b346add98db00e94be not found: ID does not exist"
Feb 18 06:31:09 crc kubenswrapper[4869]: I0218 06:31:09.860787 4869 scope.go:117] "RemoveContainer" containerID="7c2b113d45d3fbf49999782b16716bf3b016adc07764d7737a05ec1bb88a04ce"
Feb 18 06:31:09 crc kubenswrapper[4869]: E0218 06:31:09.861169 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c2b113d45d3fbf49999782b16716bf3b016adc07764d7737a05ec1bb88a04ce\": container with ID starting with 7c2b113d45d3fbf49999782b16716bf3b016adc07764d7737a05ec1bb88a04ce not found: ID does not exist" containerID="7c2b113d45d3fbf49999782b16716bf3b016adc07764d7737a05ec1bb88a04ce"
Feb 18 06:31:09 crc kubenswrapper[4869]: I0218 06:31:09.861222 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c2b113d45d3fbf49999782b16716bf3b016adc07764d7737a05ec1bb88a04ce"} err="failed to get container status \"7c2b113d45d3fbf49999782b16716bf3b016adc07764d7737a05ec1bb88a04ce\": rpc error: code = NotFound desc = could not find container \"7c2b113d45d3fbf49999782b16716bf3b016adc07764d7737a05ec1bb88a04ce\": container with ID starting with 7c2b113d45d3fbf49999782b16716bf3b016adc07764d7737a05ec1bb88a04ce not found: ID does not exist"
Feb 18 06:31:09 crc kubenswrapper[4869]: I0218 06:31:09.861259 4869 scope.go:117] "RemoveContainer" containerID="b75f3ef65cf1443bb842dc4c8917a816572054bff487435b838f639f6fa37e08"
Feb 18 06:31:09 crc kubenswrapper[4869]: E0218 06:31:09.861604 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b75f3ef65cf1443bb842dc4c8917a816572054bff487435b838f639f6fa37e08\": container with ID starting with b75f3ef65cf1443bb842dc4c8917a816572054bff487435b838f639f6fa37e08 not found: ID does not exist" containerID="b75f3ef65cf1443bb842dc4c8917a816572054bff487435b838f639f6fa37e08"
Feb 18 06:31:09 crc kubenswrapper[4869]: I0218 06:31:09.861651 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b75f3ef65cf1443bb842dc4c8917a816572054bff487435b838f639f6fa37e08"} err="failed to get container status \"b75f3ef65cf1443bb842dc4c8917a816572054bff487435b838f639f6fa37e08\": rpc error: code = NotFound desc = could not find container \"b75f3ef65cf1443bb842dc4c8917a816572054bff487435b838f639f6fa37e08\": container with ID starting with b75f3ef65cf1443bb842dc4c8917a816572054bff487435b838f639f6fa37e08 not found: ID does not exist"
Feb 18 06:31:11 crc kubenswrapper[4869]: I0218 06:31:11.483528 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f246250a-7946-47e3-96c5-e4c071e022fa" path="/var/lib/kubelet/pods/f246250a-7946-47e3-96c5-e4c071e022fa/volumes"
Feb 18 06:31:12 crc kubenswrapper[4869]: I0218 06:31:12.660323 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xbms2"]
Feb 18 06:31:12 crc kubenswrapper[4869]: E0218 06:31:12.665891 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f246250a-7946-47e3-96c5-e4c071e022fa" containerName="extract-utilities"
Feb 18 06:31:12 crc kubenswrapper[4869]: I0218 06:31:12.665928 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f246250a-7946-47e3-96c5-e4c071e022fa" containerName="extract-utilities"
Feb 18 06:31:12 crc kubenswrapper[4869]: E0218 06:31:12.665974 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f246250a-7946-47e3-96c5-e4c071e022fa" containerName="registry-server"
Feb 18 06:31:12 crc kubenswrapper[4869]: I0218 06:31:12.665987 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f246250a-7946-47e3-96c5-e4c071e022fa" containerName="registry-server"
Feb 18 06:31:12 crc kubenswrapper[4869]: E0218 06:31:12.666028 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f246250a-7946-47e3-96c5-e4c071e022fa" containerName="extract-content"
Feb 18 06:31:12 crc kubenswrapper[4869]: I0218 06:31:12.666041 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f246250a-7946-47e3-96c5-e4c071e022fa" containerName="extract-content"
Feb 18 06:31:12 crc kubenswrapper[4869]: I0218 06:31:12.666696 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f246250a-7946-47e3-96c5-e4c071e022fa" containerName="registry-server"
Feb 18 06:31:12 crc kubenswrapper[4869]: I0218 06:31:12.685902 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xbms2"]
Feb 18 06:31:12 crc kubenswrapper[4869]: I0218 06:31:12.687434 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xbms2"
Feb 18 06:31:12 crc kubenswrapper[4869]: I0218 06:31:12.858166 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/368744c9-f5ef-4bab-9896-c809efd599a1-utilities\") pod \"redhat-operators-xbms2\" (UID: \"368744c9-f5ef-4bab-9896-c809efd599a1\") " pod="openshift-marketplace/redhat-operators-xbms2"
Feb 18 06:31:12 crc kubenswrapper[4869]: I0218 06:31:12.858562 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhmkt\" (UniqueName: \"kubernetes.io/projected/368744c9-f5ef-4bab-9896-c809efd599a1-kube-api-access-zhmkt\") pod \"redhat-operators-xbms2\" (UID: \"368744c9-f5ef-4bab-9896-c809efd599a1\") " pod="openshift-marketplace/redhat-operators-xbms2"
Feb 18 06:31:12 crc kubenswrapper[4869]: I0218 06:31:12.858686 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/368744c9-f5ef-4bab-9896-c809efd599a1-catalog-content\") pod \"redhat-operators-xbms2\" (UID: \"368744c9-f5ef-4bab-9896-c809efd599a1\") " pod="openshift-marketplace/redhat-operators-xbms2"
Feb 18 06:31:12 crc kubenswrapper[4869]: I0218 06:31:12.961918 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/368744c9-f5ef-4bab-9896-c809efd599a1-utilities\") pod \"redhat-operators-xbms2\" (UID: \"368744c9-f5ef-4bab-9896-c809efd599a1\") " pod="openshift-marketplace/redhat-operators-xbms2"
Feb 18 06:31:12 crc kubenswrapper[4869]: I0218 06:31:12.962086 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhmkt\" (UniqueName: \"kubernetes.io/projected/368744c9-f5ef-4bab-9896-c809efd599a1-kube-api-access-zhmkt\") pod \"redhat-operators-xbms2\" (UID: \"368744c9-f5ef-4bab-9896-c809efd599a1\") " pod="openshift-marketplace/redhat-operators-xbms2"
Feb 18 06:31:12 crc kubenswrapper[4869]: I0218 06:31:12.962113 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/368744c9-f5ef-4bab-9896-c809efd599a1-catalog-content\") pod \"redhat-operators-xbms2\" (UID: \"368744c9-f5ef-4bab-9896-c809efd599a1\") " pod="openshift-marketplace/redhat-operators-xbms2"
Feb 18 06:31:12 crc kubenswrapper[4869]: I0218 06:31:12.962712 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/368744c9-f5ef-4bab-9896-c809efd599a1-catalog-content\") pod \"redhat-operators-xbms2\" (UID: \"368744c9-f5ef-4bab-9896-c809efd599a1\") " pod="openshift-marketplace/redhat-operators-xbms2"
Feb 18 06:31:12 crc kubenswrapper[4869]: I0218 06:31:12.962880 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/368744c9-f5ef-4bab-9896-c809efd599a1-utilities\") pod \"redhat-operators-xbms2\" (UID: \"368744c9-f5ef-4bab-9896-c809efd599a1\") " pod="openshift-marketplace/redhat-operators-xbms2"
Feb 18 06:31:12 crc kubenswrapper[4869]: I0218 06:31:12.995586 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhmkt\" (UniqueName: \"kubernetes.io/projected/368744c9-f5ef-4bab-9896-c809efd599a1-kube-api-access-zhmkt\") pod \"redhat-operators-xbms2\" (UID: \"368744c9-f5ef-4bab-9896-c809efd599a1\") " pod="openshift-marketplace/redhat-operators-xbms2"
Feb 18 06:31:13 crc kubenswrapper[4869]: I0218 06:31:13.018882 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xbms2"
Feb 18 06:31:13 crc kubenswrapper[4869]: I0218 06:31:13.553902 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xbms2"]
Feb 18 06:31:13 crc kubenswrapper[4869]: I0218 06:31:13.791311 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbms2" event={"ID":"368744c9-f5ef-4bab-9896-c809efd599a1","Type":"ContainerStarted","Data":"36fd7011d1d6724dbab565fd3959f40c0fb535255f41d5a3041a3a3c19b9e9a1"}
Feb 18 06:31:13 crc kubenswrapper[4869]: I0218 06:31:13.791792 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbms2" event={"ID":"368744c9-f5ef-4bab-9896-c809efd599a1","Type":"ContainerStarted","Data":"3400b36702e11c87a9f10db3de54b3218ee9e3513a40050da4b0708784000cba"}
Feb 18 06:31:14 crc kubenswrapper[4869]: I0218 06:31:14.806166 4869 generic.go:334] "Generic (PLEG): container finished" podID="368744c9-f5ef-4bab-9896-c809efd599a1" containerID="36fd7011d1d6724dbab565fd3959f40c0fb535255f41d5a3041a3a3c19b9e9a1" exitCode=0
Feb 18 06:31:14 crc kubenswrapper[4869]: I0218 06:31:14.806243 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbms2" event={"ID":"368744c9-f5ef-4bab-9896-c809efd599a1","Type":"ContainerDied","Data":"36fd7011d1d6724dbab565fd3959f40c0fb535255f41d5a3041a3a3c19b9e9a1"}
Feb 18 06:31:14 crc kubenswrapper[4869]: I0218 06:31:14.806588 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbms2" event={"ID":"368744c9-f5ef-4bab-9896-c809efd599a1","Type":"ContainerStarted","Data":"bb1610881fad09c01fffe5a82722f8b74d89193aff02d7414009843cd9aa83db"}
Feb 18 06:31:17 crc kubenswrapper[4869]: I0218 06:31:17.874195 4869 generic.go:334] "Generic (PLEG): container finished" podID="368744c9-f5ef-4bab-9896-c809efd599a1" containerID="bb1610881fad09c01fffe5a82722f8b74d89193aff02d7414009843cd9aa83db" exitCode=0
Feb 18 06:31:17 crc kubenswrapper[4869]: I0218 06:31:17.874273 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbms2" event={"ID":"368744c9-f5ef-4bab-9896-c809efd599a1","Type":"ContainerDied","Data":"bb1610881fad09c01fffe5a82722f8b74d89193aff02d7414009843cd9aa83db"}
Feb 18 06:31:18 crc kubenswrapper[4869]: I0218 06:31:18.885579 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbms2" event={"ID":"368744c9-f5ef-4bab-9896-c809efd599a1","Type":"ContainerStarted","Data":"489fae11de652b0a15d3a4f831b92e3e08df8e7c0d5f9ecc8744b59178317df5"}
Feb 18 06:31:18 crc kubenswrapper[4869]: I0218 06:31:18.907818 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xbms2" podStartSLOduration=2.453996565 podStartE2EDuration="6.907794932s" podCreationTimestamp="2026-02-18 06:31:12 +0000 UTC" firstStartedPulling="2026-02-18 06:31:13.793342365 +0000 UTC m=+2570.962430597" lastFinishedPulling="2026-02-18 06:31:18.247140732 +0000 UTC m=+2575.416228964" observedRunningTime="2026-02-18 06:31:18.902449803 +0000 UTC m=+2576.071538085" watchObservedRunningTime="2026-02-18 06:31:18.907794932 +0000 UTC m=+2576.076883204"
Feb 18 06:31:23 crc kubenswrapper[4869]: I0218 06:31:23.019858 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xbms2"
Feb 18 06:31:23 crc kubenswrapper[4869]: I0218 06:31:23.020444 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xbms2"
Feb 18 06:31:24 crc kubenswrapper[4869]: I0218 06:31:24.084228 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xbms2" podUID="368744c9-f5ef-4bab-9896-c809efd599a1" containerName="registry-server" probeResult="failure" output=<
Feb 18 06:31:24 crc kubenswrapper[4869]: timeout: failed to connect service ":50051" within 1s
Feb 18 06:31:24 crc kubenswrapper[4869]: >
Feb 18 06:31:33 crc kubenswrapper[4869]: I0218 06:31:33.103916 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xbms2"
Feb 18 06:31:33 crc kubenswrapper[4869]: I0218 06:31:33.161926 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xbms2"
Feb 18 06:31:33 crc kubenswrapper[4869]: I0218 06:31:33.365618 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xbms2"]
Feb 18 06:31:35 crc kubenswrapper[4869]: I0218 06:31:35.053046 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xbms2" podUID="368744c9-f5ef-4bab-9896-c809efd599a1" containerName="registry-server" containerID="cri-o://489fae11de652b0a15d3a4f831b92e3e08df8e7c0d5f9ecc8744b59178317df5" gracePeriod=2
Feb 18 06:31:35 crc kubenswrapper[4869]: I0218 06:31:35.526881 4869 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-xbms2" Feb 18 06:31:35 crc kubenswrapper[4869]: I0218 06:31:35.717091 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhmkt\" (UniqueName: \"kubernetes.io/projected/368744c9-f5ef-4bab-9896-c809efd599a1-kube-api-access-zhmkt\") pod \"368744c9-f5ef-4bab-9896-c809efd599a1\" (UID: \"368744c9-f5ef-4bab-9896-c809efd599a1\") " Feb 18 06:31:35 crc kubenswrapper[4869]: I0218 06:31:35.717177 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/368744c9-f5ef-4bab-9896-c809efd599a1-utilities\") pod \"368744c9-f5ef-4bab-9896-c809efd599a1\" (UID: \"368744c9-f5ef-4bab-9896-c809efd599a1\") " Feb 18 06:31:35 crc kubenswrapper[4869]: I0218 06:31:35.717224 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/368744c9-f5ef-4bab-9896-c809efd599a1-catalog-content\") pod \"368744c9-f5ef-4bab-9896-c809efd599a1\" (UID: \"368744c9-f5ef-4bab-9896-c809efd599a1\") " Feb 18 06:31:35 crc kubenswrapper[4869]: I0218 06:31:35.717962 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/368744c9-f5ef-4bab-9896-c809efd599a1-utilities" (OuterVolumeSpecName: "utilities") pod "368744c9-f5ef-4bab-9896-c809efd599a1" (UID: "368744c9-f5ef-4bab-9896-c809efd599a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:31:35 crc kubenswrapper[4869]: I0218 06:31:35.723591 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/368744c9-f5ef-4bab-9896-c809efd599a1-kube-api-access-zhmkt" (OuterVolumeSpecName: "kube-api-access-zhmkt") pod "368744c9-f5ef-4bab-9896-c809efd599a1" (UID: "368744c9-f5ef-4bab-9896-c809efd599a1"). InnerVolumeSpecName "kube-api-access-zhmkt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:31:35 crc kubenswrapper[4869]: I0218 06:31:35.819593 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhmkt\" (UniqueName: \"kubernetes.io/projected/368744c9-f5ef-4bab-9896-c809efd599a1-kube-api-access-zhmkt\") on node \"crc\" DevicePath \"\"" Feb 18 06:31:35 crc kubenswrapper[4869]: I0218 06:31:35.819856 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/368744c9-f5ef-4bab-9896-c809efd599a1-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:31:35 crc kubenswrapper[4869]: I0218 06:31:35.838030 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/368744c9-f5ef-4bab-9896-c809efd599a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "368744c9-f5ef-4bab-9896-c809efd599a1" (UID: "368744c9-f5ef-4bab-9896-c809efd599a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:31:35 crc kubenswrapper[4869]: I0218 06:31:35.921483 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/368744c9-f5ef-4bab-9896-c809efd599a1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:31:36 crc kubenswrapper[4869]: I0218 06:31:36.063407 4869 generic.go:334] "Generic (PLEG): container finished" podID="368744c9-f5ef-4bab-9896-c809efd599a1" containerID="489fae11de652b0a15d3a4f831b92e3e08df8e7c0d5f9ecc8744b59178317df5" exitCode=0 Feb 18 06:31:36 crc kubenswrapper[4869]: I0218 06:31:36.063453 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbms2" event={"ID":"368744c9-f5ef-4bab-9896-c809efd599a1","Type":"ContainerDied","Data":"489fae11de652b0a15d3a4f831b92e3e08df8e7c0d5f9ecc8744b59178317df5"} Feb 18 06:31:36 crc kubenswrapper[4869]: I0218 06:31:36.063478 4869 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-xbms2" event={"ID":"368744c9-f5ef-4bab-9896-c809efd599a1","Type":"ContainerDied","Data":"3400b36702e11c87a9f10db3de54b3218ee9e3513a40050da4b0708784000cba"} Feb 18 06:31:36 crc kubenswrapper[4869]: I0218 06:31:36.063497 4869 scope.go:117] "RemoveContainer" containerID="489fae11de652b0a15d3a4f831b92e3e08df8e7c0d5f9ecc8744b59178317df5" Feb 18 06:31:36 crc kubenswrapper[4869]: I0218 06:31:36.064591 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xbms2" Feb 18 06:31:36 crc kubenswrapper[4869]: I0218 06:31:36.084165 4869 scope.go:117] "RemoveContainer" containerID="bb1610881fad09c01fffe5a82722f8b74d89193aff02d7414009843cd9aa83db" Feb 18 06:31:36 crc kubenswrapper[4869]: I0218 06:31:36.099927 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xbms2"] Feb 18 06:31:36 crc kubenswrapper[4869]: I0218 06:31:36.107696 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xbms2"] Feb 18 06:31:36 crc kubenswrapper[4869]: I0218 06:31:36.144865 4869 scope.go:117] "RemoveContainer" containerID="36fd7011d1d6724dbab565fd3959f40c0fb535255f41d5a3041a3a3c19b9e9a1" Feb 18 06:31:36 crc kubenswrapper[4869]: I0218 06:31:36.167973 4869 scope.go:117] "RemoveContainer" containerID="489fae11de652b0a15d3a4f831b92e3e08df8e7c0d5f9ecc8744b59178317df5" Feb 18 06:31:36 crc kubenswrapper[4869]: E0218 06:31:36.169110 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"489fae11de652b0a15d3a4f831b92e3e08df8e7c0d5f9ecc8744b59178317df5\": container with ID starting with 489fae11de652b0a15d3a4f831b92e3e08df8e7c0d5f9ecc8744b59178317df5 not found: ID does not exist" containerID="489fae11de652b0a15d3a4f831b92e3e08df8e7c0d5f9ecc8744b59178317df5" Feb 18 06:31:36 crc kubenswrapper[4869]: I0218 06:31:36.169136 4869 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"489fae11de652b0a15d3a4f831b92e3e08df8e7c0d5f9ecc8744b59178317df5"} err="failed to get container status \"489fae11de652b0a15d3a4f831b92e3e08df8e7c0d5f9ecc8744b59178317df5\": rpc error: code = NotFound desc = could not find container \"489fae11de652b0a15d3a4f831b92e3e08df8e7c0d5f9ecc8744b59178317df5\": container with ID starting with 489fae11de652b0a15d3a4f831b92e3e08df8e7c0d5f9ecc8744b59178317df5 not found: ID does not exist" Feb 18 06:31:36 crc kubenswrapper[4869]: I0218 06:31:36.169155 4869 scope.go:117] "RemoveContainer" containerID="bb1610881fad09c01fffe5a82722f8b74d89193aff02d7414009843cd9aa83db" Feb 18 06:31:36 crc kubenswrapper[4869]: E0218 06:31:36.169459 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb1610881fad09c01fffe5a82722f8b74d89193aff02d7414009843cd9aa83db\": container with ID starting with bb1610881fad09c01fffe5a82722f8b74d89193aff02d7414009843cd9aa83db not found: ID does not exist" containerID="bb1610881fad09c01fffe5a82722f8b74d89193aff02d7414009843cd9aa83db" Feb 18 06:31:36 crc kubenswrapper[4869]: I0218 06:31:36.169481 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb1610881fad09c01fffe5a82722f8b74d89193aff02d7414009843cd9aa83db"} err="failed to get container status \"bb1610881fad09c01fffe5a82722f8b74d89193aff02d7414009843cd9aa83db\": rpc error: code = NotFound desc = could not find container \"bb1610881fad09c01fffe5a82722f8b74d89193aff02d7414009843cd9aa83db\": container with ID starting with bb1610881fad09c01fffe5a82722f8b74d89193aff02d7414009843cd9aa83db not found: ID does not exist" Feb 18 06:31:36 crc kubenswrapper[4869]: I0218 06:31:36.169493 4869 scope.go:117] "RemoveContainer" containerID="36fd7011d1d6724dbab565fd3959f40c0fb535255f41d5a3041a3a3c19b9e9a1" Feb 18 06:31:36 crc kubenswrapper[4869]: E0218 
06:31:36.169760 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36fd7011d1d6724dbab565fd3959f40c0fb535255f41d5a3041a3a3c19b9e9a1\": container with ID starting with 36fd7011d1d6724dbab565fd3959f40c0fb535255f41d5a3041a3a3c19b9e9a1 not found: ID does not exist" containerID="36fd7011d1d6724dbab565fd3959f40c0fb535255f41d5a3041a3a3c19b9e9a1" Feb 18 06:31:36 crc kubenswrapper[4869]: I0218 06:31:36.169780 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36fd7011d1d6724dbab565fd3959f40c0fb535255f41d5a3041a3a3c19b9e9a1"} err="failed to get container status \"36fd7011d1d6724dbab565fd3959f40c0fb535255f41d5a3041a3a3c19b9e9a1\": rpc error: code = NotFound desc = could not find container \"36fd7011d1d6724dbab565fd3959f40c0fb535255f41d5a3041a3a3c19b9e9a1\": container with ID starting with 36fd7011d1d6724dbab565fd3959f40c0fb535255f41d5a3041a3a3c19b9e9a1 not found: ID does not exist" Feb 18 06:31:37 crc kubenswrapper[4869]: I0218 06:31:37.485490 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="368744c9-f5ef-4bab-9896-c809efd599a1" path="/var/lib/kubelet/pods/368744c9-f5ef-4bab-9896-c809efd599a1/volumes" Feb 18 06:33:10 crc kubenswrapper[4869]: I0218 06:33:10.133282 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:33:10 crc kubenswrapper[4869]: I0218 06:33:10.133907 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 18 06:33:40 crc kubenswrapper[4869]: I0218 06:33:40.132558 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:33:40 crc kubenswrapper[4869]: I0218 06:33:40.133167 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:34:10 crc kubenswrapper[4869]: I0218 06:34:10.132844 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:34:10 crc kubenswrapper[4869]: I0218 06:34:10.133428 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:34:10 crc kubenswrapper[4869]: I0218 06:34:10.133477 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" Feb 18 06:34:10 crc kubenswrapper[4869]: I0218 06:34:10.134242 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"45d6d1d3801d55e94174b9f3903d1e89f908ddba36891afe044a8bac1a4b58b0"} 
pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 06:34:10 crc kubenswrapper[4869]: I0218 06:34:10.134299 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" containerID="cri-o://45d6d1d3801d55e94174b9f3903d1e89f908ddba36891afe044a8bac1a4b58b0" gracePeriod=600 Feb 18 06:34:11 crc kubenswrapper[4869]: I0218 06:34:11.557824 4869 generic.go:334] "Generic (PLEG): container finished" podID="781aec66-5fc7-4161-a704-cc78830d525d" containerID="45d6d1d3801d55e94174b9f3903d1e89f908ddba36891afe044a8bac1a4b58b0" exitCode=0 Feb 18 06:34:11 crc kubenswrapper[4869]: I0218 06:34:11.557898 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerDied","Data":"45d6d1d3801d55e94174b9f3903d1e89f908ddba36891afe044a8bac1a4b58b0"} Feb 18 06:34:11 crc kubenswrapper[4869]: I0218 06:34:11.558443 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerStarted","Data":"23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6"} Feb 18 06:34:11 crc kubenswrapper[4869]: I0218 06:34:11.558476 4869 scope.go:117] "RemoveContainer" containerID="1afd4b32e3650be9b4e9bc351073861614c8e7b5694cf29bc6439d2989d348ad" Feb 18 06:36:10 crc kubenswrapper[4869]: I0218 06:36:10.132593 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Feb 18 06:36:10 crc kubenswrapper[4869]: I0218 06:36:10.133415 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:36:40 crc kubenswrapper[4869]: I0218 06:36:40.132462 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:36:40 crc kubenswrapper[4869]: I0218 06:36:40.133197 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:37:10 crc kubenswrapper[4869]: I0218 06:37:10.133193 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:37:10 crc kubenswrapper[4869]: I0218 06:37:10.134351 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:37:10 crc kubenswrapper[4869]: I0218 06:37:10.134453 4869 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" Feb 18 06:37:10 crc kubenswrapper[4869]: I0218 06:37:10.136248 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6"} pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 06:37:10 crc kubenswrapper[4869]: I0218 06:37:10.136413 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" containerID="cri-o://23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6" gracePeriod=600 Feb 18 06:37:10 crc kubenswrapper[4869]: E0218 06:37:10.259910 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:37:11 crc kubenswrapper[4869]: I0218 06:37:11.179786 4869 generic.go:334] "Generic (PLEG): container finished" podID="781aec66-5fc7-4161-a704-cc78830d525d" containerID="23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6" exitCode=0 Feb 18 06:37:11 crc kubenswrapper[4869]: I0218 06:37:11.179826 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" 
event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerDied","Data":"23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6"} Feb 18 06:37:11 crc kubenswrapper[4869]: I0218 06:37:11.179859 4869 scope.go:117] "RemoveContainer" containerID="45d6d1d3801d55e94174b9f3903d1e89f908ddba36891afe044a8bac1a4b58b0" Feb 18 06:37:11 crc kubenswrapper[4869]: I0218 06:37:11.180419 4869 scope.go:117] "RemoveContainer" containerID="23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6" Feb 18 06:37:11 crc kubenswrapper[4869]: E0218 06:37:11.180666 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:37:24 crc kubenswrapper[4869]: I0218 06:37:24.470851 4869 scope.go:117] "RemoveContainer" containerID="23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6" Feb 18 06:37:24 crc kubenswrapper[4869]: E0218 06:37:24.472124 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:37:37 crc kubenswrapper[4869]: I0218 06:37:37.470852 4869 scope.go:117] "RemoveContainer" containerID="23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6" Feb 18 06:37:37 crc kubenswrapper[4869]: E0218 06:37:37.471784 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:37:50 crc kubenswrapper[4869]: I0218 06:37:50.470789 4869 scope.go:117] "RemoveContainer" containerID="23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6" Feb 18 06:37:50 crc kubenswrapper[4869]: E0218 06:37:50.471653 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:38:04 crc kubenswrapper[4869]: I0218 06:38:04.469936 4869 scope.go:117] "RemoveContainer" containerID="23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6" Feb 18 06:38:04 crc kubenswrapper[4869]: E0218 06:38:04.470797 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:38:19 crc kubenswrapper[4869]: I0218 06:38:19.469811 4869 scope.go:117] "RemoveContainer" containerID="23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6" Feb 18 06:38:19 crc kubenswrapper[4869]: E0218 06:38:19.470640 4869 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:38:34 crc kubenswrapper[4869]: I0218 06:38:34.470329 4869 scope.go:117] "RemoveContainer" containerID="23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6" Feb 18 06:38:34 crc kubenswrapper[4869]: E0218 06:38:34.471624 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:38:48 crc kubenswrapper[4869]: I0218 06:38:48.470576 4869 scope.go:117] "RemoveContainer" containerID="23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6" Feb 18 06:38:48 crc kubenswrapper[4869]: E0218 06:38:48.471685 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:39:03 crc kubenswrapper[4869]: I0218 06:39:03.479816 4869 scope.go:117] "RemoveContainer" containerID="23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6" Feb 18 06:39:03 crc kubenswrapper[4869]: E0218 06:39:03.480874 4869 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:39:15 crc kubenswrapper[4869]: I0218 06:39:15.470067 4869 scope.go:117] "RemoveContainer" containerID="23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6" Feb 18 06:39:15 crc kubenswrapper[4869]: E0218 06:39:15.471343 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:39:26 crc kubenswrapper[4869]: I0218 06:39:26.470339 4869 scope.go:117] "RemoveContainer" containerID="23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6" Feb 18 06:39:26 crc kubenswrapper[4869]: E0218 06:39:26.471149 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:39:41 crc kubenswrapper[4869]: I0218 06:39:41.470765 4869 scope.go:117] "RemoveContainer" containerID="23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6" Feb 18 06:39:41 crc kubenswrapper[4869]: E0218 06:39:41.472146 4869 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d"
Feb 18 06:39:56 crc kubenswrapper[4869]: I0218 06:39:56.470504 4869 scope.go:117] "RemoveContainer" containerID="23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6"
Feb 18 06:39:56 crc kubenswrapper[4869]: E0218 06:39:56.472557 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d"
Feb 18 06:40:08 crc kubenswrapper[4869]: I0218 06:40:08.471590 4869 scope.go:117] "RemoveContainer" containerID="23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6"
Feb 18 06:40:08 crc kubenswrapper[4869]: E0218 06:40:08.472533 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d"
Feb 18 06:40:09 crc kubenswrapper[4869]: I0218 06:40:09.885337 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6jt6c"]
Feb 18 06:40:09 crc kubenswrapper[4869]: E0218 06:40:09.885830 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="368744c9-f5ef-4bab-9896-c809efd599a1" containerName="extract-content"
Feb 18 06:40:09 crc kubenswrapper[4869]: I0218 06:40:09.885849 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="368744c9-f5ef-4bab-9896-c809efd599a1" containerName="extract-content"
Feb 18 06:40:09 crc kubenswrapper[4869]: E0218 06:40:09.885872 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="368744c9-f5ef-4bab-9896-c809efd599a1" containerName="extract-utilities"
Feb 18 06:40:09 crc kubenswrapper[4869]: I0218 06:40:09.885881 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="368744c9-f5ef-4bab-9896-c809efd599a1" containerName="extract-utilities"
Feb 18 06:40:09 crc kubenswrapper[4869]: E0218 06:40:09.885900 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="368744c9-f5ef-4bab-9896-c809efd599a1" containerName="registry-server"
Feb 18 06:40:09 crc kubenswrapper[4869]: I0218 06:40:09.885908 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="368744c9-f5ef-4bab-9896-c809efd599a1" containerName="registry-server"
Feb 18 06:40:09 crc kubenswrapper[4869]: I0218 06:40:09.886253 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="368744c9-f5ef-4bab-9896-c809efd599a1" containerName="registry-server"
Feb 18 06:40:09 crc kubenswrapper[4869]: I0218 06:40:09.888071 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6jt6c"
Feb 18 06:40:09 crc kubenswrapper[4869]: I0218 06:40:09.907417 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6jt6c"]
Feb 18 06:40:10 crc kubenswrapper[4869]: I0218 06:40:10.066181 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d70f15b-ac2e-467c-b06e-8bbdf0055651-utilities\") pod \"community-operators-6jt6c\" (UID: \"3d70f15b-ac2e-467c-b06e-8bbdf0055651\") " pod="openshift-marketplace/community-operators-6jt6c"
Feb 18 06:40:10 crc kubenswrapper[4869]: I0218 06:40:10.066263 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5hg9\" (UniqueName: \"kubernetes.io/projected/3d70f15b-ac2e-467c-b06e-8bbdf0055651-kube-api-access-d5hg9\") pod \"community-operators-6jt6c\" (UID: \"3d70f15b-ac2e-467c-b06e-8bbdf0055651\") " pod="openshift-marketplace/community-operators-6jt6c"
Feb 18 06:40:10 crc kubenswrapper[4869]: I0218 06:40:10.066359 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d70f15b-ac2e-467c-b06e-8bbdf0055651-catalog-content\") pod \"community-operators-6jt6c\" (UID: \"3d70f15b-ac2e-467c-b06e-8bbdf0055651\") " pod="openshift-marketplace/community-operators-6jt6c"
Feb 18 06:40:10 crc kubenswrapper[4869]: I0218 06:40:10.168253 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d70f15b-ac2e-467c-b06e-8bbdf0055651-utilities\") pod \"community-operators-6jt6c\" (UID: \"3d70f15b-ac2e-467c-b06e-8bbdf0055651\") " pod="openshift-marketplace/community-operators-6jt6c"
Feb 18 06:40:10 crc kubenswrapper[4869]: I0218 06:40:10.168567 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5hg9\" (UniqueName: \"kubernetes.io/projected/3d70f15b-ac2e-467c-b06e-8bbdf0055651-kube-api-access-d5hg9\") pod \"community-operators-6jt6c\" (UID: \"3d70f15b-ac2e-467c-b06e-8bbdf0055651\") " pod="openshift-marketplace/community-operators-6jt6c"
Feb 18 06:40:10 crc kubenswrapper[4869]: I0218 06:40:10.168631 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d70f15b-ac2e-467c-b06e-8bbdf0055651-catalog-content\") pod \"community-operators-6jt6c\" (UID: \"3d70f15b-ac2e-467c-b06e-8bbdf0055651\") " pod="openshift-marketplace/community-operators-6jt6c"
Feb 18 06:40:10 crc kubenswrapper[4869]: I0218 06:40:10.168813 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d70f15b-ac2e-467c-b06e-8bbdf0055651-utilities\") pod \"community-operators-6jt6c\" (UID: \"3d70f15b-ac2e-467c-b06e-8bbdf0055651\") " pod="openshift-marketplace/community-operators-6jt6c"
Feb 18 06:40:10 crc kubenswrapper[4869]: I0218 06:40:10.169122 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d70f15b-ac2e-467c-b06e-8bbdf0055651-catalog-content\") pod \"community-operators-6jt6c\" (UID: \"3d70f15b-ac2e-467c-b06e-8bbdf0055651\") " pod="openshift-marketplace/community-operators-6jt6c"
Feb 18 06:40:10 crc kubenswrapper[4869]: I0218 06:40:10.198739 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5hg9\" (UniqueName: \"kubernetes.io/projected/3d70f15b-ac2e-467c-b06e-8bbdf0055651-kube-api-access-d5hg9\") pod \"community-operators-6jt6c\" (UID: \"3d70f15b-ac2e-467c-b06e-8bbdf0055651\") " pod="openshift-marketplace/community-operators-6jt6c"
Feb 18 06:40:10 crc kubenswrapper[4869]: I0218 06:40:10.231578 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6jt6c"
Feb 18 06:40:10 crc kubenswrapper[4869]: I0218 06:40:10.880220 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6jt6c"]
Feb 18 06:40:11 crc kubenswrapper[4869]: I0218 06:40:11.892480 4869 generic.go:334] "Generic (PLEG): container finished" podID="3d70f15b-ac2e-467c-b06e-8bbdf0055651" containerID="bc56eaaf9aed2f510e510e7970fe5e92d81c9ff7dbbd7c77e5e00fc2af00696d" exitCode=0
Feb 18 06:40:11 crc kubenswrapper[4869]: I0218 06:40:11.892562 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6jt6c" event={"ID":"3d70f15b-ac2e-467c-b06e-8bbdf0055651","Type":"ContainerDied","Data":"bc56eaaf9aed2f510e510e7970fe5e92d81c9ff7dbbd7c77e5e00fc2af00696d"}
Feb 18 06:40:11 crc kubenswrapper[4869]: I0218 06:40:11.894373 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6jt6c" event={"ID":"3d70f15b-ac2e-467c-b06e-8bbdf0055651","Type":"ContainerStarted","Data":"80abe1ff79164bb47ff7c254228706d9bc78a2d941a7e71d92ffa179ae1c736d"}
Feb 18 06:40:11 crc kubenswrapper[4869]: I0218 06:40:11.895902 4869 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 18 06:40:12 crc kubenswrapper[4869]: I0218 06:40:12.905691 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6jt6c" event={"ID":"3d70f15b-ac2e-467c-b06e-8bbdf0055651","Type":"ContainerStarted","Data":"bf49c999400002dca2c5b293f5fbde973dbbcdfb6371ece974f463fba66b867e"}
Feb 18 06:40:13 crc kubenswrapper[4869]: I0218 06:40:13.060852 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t7vfx"]
Feb 18 06:40:13 crc kubenswrapper[4869]: I0218 06:40:13.062797 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t7vfx"
Feb 18 06:40:13 crc kubenswrapper[4869]: I0218 06:40:13.077050 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t7vfx"]
Feb 18 06:40:13 crc kubenswrapper[4869]: I0218 06:40:13.234957 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65c6bb33-1df9-446b-8589-99c1bc159f0b-utilities\") pod \"certified-operators-t7vfx\" (UID: \"65c6bb33-1df9-446b-8589-99c1bc159f0b\") " pod="openshift-marketplace/certified-operators-t7vfx"
Feb 18 06:40:13 crc kubenswrapper[4869]: I0218 06:40:13.235085 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65c6bb33-1df9-446b-8589-99c1bc159f0b-catalog-content\") pod \"certified-operators-t7vfx\" (UID: \"65c6bb33-1df9-446b-8589-99c1bc159f0b\") " pod="openshift-marketplace/certified-operators-t7vfx"
Feb 18 06:40:13 crc kubenswrapper[4869]: I0218 06:40:13.235116 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z894x\" (UniqueName: \"kubernetes.io/projected/65c6bb33-1df9-446b-8589-99c1bc159f0b-kube-api-access-z894x\") pod \"certified-operators-t7vfx\" (UID: \"65c6bb33-1df9-446b-8589-99c1bc159f0b\") " pod="openshift-marketplace/certified-operators-t7vfx"
Feb 18 06:40:13 crc kubenswrapper[4869]: I0218 06:40:13.337056 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65c6bb33-1df9-446b-8589-99c1bc159f0b-catalog-content\") pod \"certified-operators-t7vfx\" (UID: \"65c6bb33-1df9-446b-8589-99c1bc159f0b\") " pod="openshift-marketplace/certified-operators-t7vfx"
Feb 18 06:40:13 crc kubenswrapper[4869]: I0218 06:40:13.337098 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z894x\" (UniqueName: \"kubernetes.io/projected/65c6bb33-1df9-446b-8589-99c1bc159f0b-kube-api-access-z894x\") pod \"certified-operators-t7vfx\" (UID: \"65c6bb33-1df9-446b-8589-99c1bc159f0b\") " pod="openshift-marketplace/certified-operators-t7vfx"
Feb 18 06:40:13 crc kubenswrapper[4869]: I0218 06:40:13.337210 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65c6bb33-1df9-446b-8589-99c1bc159f0b-utilities\") pod \"certified-operators-t7vfx\" (UID: \"65c6bb33-1df9-446b-8589-99c1bc159f0b\") " pod="openshift-marketplace/certified-operators-t7vfx"
Feb 18 06:40:13 crc kubenswrapper[4869]: I0218 06:40:13.337513 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65c6bb33-1df9-446b-8589-99c1bc159f0b-catalog-content\") pod \"certified-operators-t7vfx\" (UID: \"65c6bb33-1df9-446b-8589-99c1bc159f0b\") " pod="openshift-marketplace/certified-operators-t7vfx"
Feb 18 06:40:13 crc kubenswrapper[4869]: I0218 06:40:13.337542 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65c6bb33-1df9-446b-8589-99c1bc159f0b-utilities\") pod \"certified-operators-t7vfx\" (UID: \"65c6bb33-1df9-446b-8589-99c1bc159f0b\") " pod="openshift-marketplace/certified-operators-t7vfx"
Feb 18 06:40:13 crc kubenswrapper[4869]: I0218 06:40:13.358715 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z894x\" (UniqueName: \"kubernetes.io/projected/65c6bb33-1df9-446b-8589-99c1bc159f0b-kube-api-access-z894x\") pod \"certified-operators-t7vfx\" (UID: \"65c6bb33-1df9-446b-8589-99c1bc159f0b\") " pod="openshift-marketplace/certified-operators-t7vfx"
Feb 18 06:40:13 crc kubenswrapper[4869]: I0218 06:40:13.400967 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t7vfx"
Feb 18 06:40:13 crc kubenswrapper[4869]: I0218 06:40:13.867487 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t7vfx"]
Feb 18 06:40:13 crc kubenswrapper[4869]: I0218 06:40:13.930425 4869 generic.go:334] "Generic (PLEG): container finished" podID="3d70f15b-ac2e-467c-b06e-8bbdf0055651" containerID="bf49c999400002dca2c5b293f5fbde973dbbcdfb6371ece974f463fba66b867e" exitCode=0
Feb 18 06:40:13 crc kubenswrapper[4869]: I0218 06:40:13.930499 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6jt6c" event={"ID":"3d70f15b-ac2e-467c-b06e-8bbdf0055651","Type":"ContainerDied","Data":"bf49c999400002dca2c5b293f5fbde973dbbcdfb6371ece974f463fba66b867e"}
Feb 18 06:40:13 crc kubenswrapper[4869]: I0218 06:40:13.940361 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t7vfx" event={"ID":"65c6bb33-1df9-446b-8589-99c1bc159f0b","Type":"ContainerStarted","Data":"c5cfd4708570a610054917a258599b0bfe081db034ee27280c03329b6741a5e2"}
Feb 18 06:40:14 crc kubenswrapper[4869]: I0218 06:40:14.950857 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6jt6c" event={"ID":"3d70f15b-ac2e-467c-b06e-8bbdf0055651","Type":"ContainerStarted","Data":"5397a1088327d4937caf461ceebff315cf712e073951a03257b46c3f1402b9af"}
Feb 18 06:40:14 crc kubenswrapper[4869]: I0218 06:40:14.952717 4869 generic.go:334] "Generic (PLEG): container finished" podID="65c6bb33-1df9-446b-8589-99c1bc159f0b" containerID="d50ee5f43114a46d680a9585f3afa1cb8e7adcde2b5a693b5b5bc721e3e604bc" exitCode=0
Feb 18 06:40:14 crc kubenswrapper[4869]: I0218 06:40:14.952785 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t7vfx" event={"ID":"65c6bb33-1df9-446b-8589-99c1bc159f0b","Type":"ContainerDied","Data":"d50ee5f43114a46d680a9585f3afa1cb8e7adcde2b5a693b5b5bc721e3e604bc"}
Feb 18 06:40:14 crc kubenswrapper[4869]: I0218 06:40:14.981721 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6jt6c" podStartSLOduration=3.524154979 podStartE2EDuration="5.98169401s" podCreationTimestamp="2026-02-18 06:40:09 +0000 UTC" firstStartedPulling="2026-02-18 06:40:11.895594546 +0000 UTC m=+3109.064682788" lastFinishedPulling="2026-02-18 06:40:14.353133537 +0000 UTC m=+3111.522221819" observedRunningTime="2026-02-18 06:40:14.978654707 +0000 UTC m=+3112.147742939" watchObservedRunningTime="2026-02-18 06:40:14.98169401 +0000 UTC m=+3112.150782262"
Feb 18 06:40:15 crc kubenswrapper[4869]: I0218 06:40:15.963020 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t7vfx" event={"ID":"65c6bb33-1df9-446b-8589-99c1bc159f0b","Type":"ContainerStarted","Data":"b77140a7d5fb6824d6821f00a1fba4e90616504dd505494b3db5c97bd3f34220"}
Feb 18 06:40:17 crc kubenswrapper[4869]: I0218 06:40:17.980051 4869 generic.go:334] "Generic (PLEG): container finished" podID="65c6bb33-1df9-446b-8589-99c1bc159f0b" containerID="b77140a7d5fb6824d6821f00a1fba4e90616504dd505494b3db5c97bd3f34220" exitCode=0
Feb 18 06:40:17 crc kubenswrapper[4869]: I0218 06:40:17.980130 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t7vfx" event={"ID":"65c6bb33-1df9-446b-8589-99c1bc159f0b","Type":"ContainerDied","Data":"b77140a7d5fb6824d6821f00a1fba4e90616504dd505494b3db5c97bd3f34220"}
Feb 18 06:40:18 crc kubenswrapper[4869]: I0218 06:40:18.993099 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t7vfx" event={"ID":"65c6bb33-1df9-446b-8589-99c1bc159f0b","Type":"ContainerStarted","Data":"d6c37894396883bd99f5b1c3fe37808f66de1adaaf356b40be47c7b7df9d8d5f"}
Feb 18 06:40:19 crc kubenswrapper[4869]: I0218 06:40:19.026125 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t7vfx" podStartSLOduration=2.392413792 podStartE2EDuration="6.026104336s" podCreationTimestamp="2026-02-18 06:40:13 +0000 UTC" firstStartedPulling="2026-02-18 06:40:14.954331673 +0000 UTC m=+3112.123419905" lastFinishedPulling="2026-02-18 06:40:18.588022217 +0000 UTC m=+3115.757110449" observedRunningTime="2026-02-18 06:40:19.017218272 +0000 UTC m=+3116.186306514" watchObservedRunningTime="2026-02-18 06:40:19.026104336 +0000 UTC m=+3116.195192578"
Feb 18 06:40:20 crc kubenswrapper[4869]: I0218 06:40:20.233019 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6jt6c"
Feb 18 06:40:20 crc kubenswrapper[4869]: I0218 06:40:20.233068 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6jt6c"
Feb 18 06:40:20 crc kubenswrapper[4869]: I0218 06:40:20.281475 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6jt6c"
Feb 18 06:40:21 crc kubenswrapper[4869]: I0218 06:40:21.051614 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6jt6c"
Feb 18 06:40:23 crc kubenswrapper[4869]: I0218 06:40:23.401804 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t7vfx"
Feb 18 06:40:23 crc kubenswrapper[4869]: I0218 06:40:23.402348 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t7vfx"
Feb 18 06:40:23 crc kubenswrapper[4869]: I0218 06:40:23.460962 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t7vfx"
Feb 18 06:40:23 crc kubenswrapper[4869]: I0218 06:40:23.470295 4869 scope.go:117] "RemoveContainer" containerID="23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6"
Feb 18 06:40:23 crc kubenswrapper[4869]: E0218 06:40:23.470551 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d"
Feb 18 06:40:24 crc kubenswrapper[4869]: I0218 06:40:24.111247 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t7vfx"
Feb 18 06:40:24 crc kubenswrapper[4869]: I0218 06:40:24.251106 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6jt6c"]
Feb 18 06:40:24 crc kubenswrapper[4869]: I0218 06:40:24.251363 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6jt6c" podUID="3d70f15b-ac2e-467c-b06e-8bbdf0055651" containerName="registry-server" containerID="cri-o://5397a1088327d4937caf461ceebff315cf712e073951a03257b46c3f1402b9af" gracePeriod=2
Feb 18 06:40:24 crc kubenswrapper[4869]: I0218 06:40:24.740702 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6jt6c"
Feb 18 06:40:24 crc kubenswrapper[4869]: I0218 06:40:24.882604 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d70f15b-ac2e-467c-b06e-8bbdf0055651-utilities\") pod \"3d70f15b-ac2e-467c-b06e-8bbdf0055651\" (UID: \"3d70f15b-ac2e-467c-b06e-8bbdf0055651\") "
Feb 18 06:40:24 crc kubenswrapper[4869]: I0218 06:40:24.882831 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d70f15b-ac2e-467c-b06e-8bbdf0055651-catalog-content\") pod \"3d70f15b-ac2e-467c-b06e-8bbdf0055651\" (UID: \"3d70f15b-ac2e-467c-b06e-8bbdf0055651\") "
Feb 18 06:40:24 crc kubenswrapper[4869]: I0218 06:40:24.882858 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5hg9\" (UniqueName: \"kubernetes.io/projected/3d70f15b-ac2e-467c-b06e-8bbdf0055651-kube-api-access-d5hg9\") pod \"3d70f15b-ac2e-467c-b06e-8bbdf0055651\" (UID: \"3d70f15b-ac2e-467c-b06e-8bbdf0055651\") "
Feb 18 06:40:24 crc kubenswrapper[4869]: I0218 06:40:24.883982 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d70f15b-ac2e-467c-b06e-8bbdf0055651-utilities" (OuterVolumeSpecName: "utilities") pod "3d70f15b-ac2e-467c-b06e-8bbdf0055651" (UID: "3d70f15b-ac2e-467c-b06e-8bbdf0055651"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:40:24 crc kubenswrapper[4869]: I0218 06:40:24.895276 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d70f15b-ac2e-467c-b06e-8bbdf0055651-kube-api-access-d5hg9" (OuterVolumeSpecName: "kube-api-access-d5hg9") pod "3d70f15b-ac2e-467c-b06e-8bbdf0055651" (UID: "3d70f15b-ac2e-467c-b06e-8bbdf0055651"). InnerVolumeSpecName "kube-api-access-d5hg9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:40:24 crc kubenswrapper[4869]: I0218 06:40:24.929875 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d70f15b-ac2e-467c-b06e-8bbdf0055651-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d70f15b-ac2e-467c-b06e-8bbdf0055651" (UID: "3d70f15b-ac2e-467c-b06e-8bbdf0055651"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:40:24 crc kubenswrapper[4869]: I0218 06:40:24.984706 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d70f15b-ac2e-467c-b06e-8bbdf0055651-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 06:40:24 crc kubenswrapper[4869]: I0218 06:40:24.984740 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5hg9\" (UniqueName: \"kubernetes.io/projected/3d70f15b-ac2e-467c-b06e-8bbdf0055651-kube-api-access-d5hg9\") on node \"crc\" DevicePath \"\""
Feb 18 06:40:24 crc kubenswrapper[4869]: I0218 06:40:24.984785 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d70f15b-ac2e-467c-b06e-8bbdf0055651-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 06:40:25 crc kubenswrapper[4869]: I0218 06:40:25.050256 4869 generic.go:334] "Generic (PLEG): container finished" podID="3d70f15b-ac2e-467c-b06e-8bbdf0055651" containerID="5397a1088327d4937caf461ceebff315cf712e073951a03257b46c3f1402b9af" exitCode=0
Feb 18 06:40:25 crc kubenswrapper[4869]: I0218 06:40:25.050840 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6jt6c" event={"ID":"3d70f15b-ac2e-467c-b06e-8bbdf0055651","Type":"ContainerDied","Data":"5397a1088327d4937caf461ceebff315cf712e073951a03257b46c3f1402b9af"}
Feb 18 06:40:25 crc kubenswrapper[4869]: I0218 06:40:25.050881 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6jt6c" event={"ID":"3d70f15b-ac2e-467c-b06e-8bbdf0055651","Type":"ContainerDied","Data":"80abe1ff79164bb47ff7c254228706d9bc78a2d941a7e71d92ffa179ae1c736d"}
Feb 18 06:40:25 crc kubenswrapper[4869]: I0218 06:40:25.050905 4869 scope.go:117] "RemoveContainer" containerID="5397a1088327d4937caf461ceebff315cf712e073951a03257b46c3f1402b9af"
Feb 18 06:40:25 crc kubenswrapper[4869]: I0218 06:40:25.050931 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6jt6c"
Feb 18 06:40:25 crc kubenswrapper[4869]: I0218 06:40:25.073622 4869 scope.go:117] "RemoveContainer" containerID="bf49c999400002dca2c5b293f5fbde973dbbcdfb6371ece974f463fba66b867e"
Feb 18 06:40:25 crc kubenswrapper[4869]: I0218 06:40:25.088431 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6jt6c"]
Feb 18 06:40:25 crc kubenswrapper[4869]: I0218 06:40:25.096661 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6jt6c"]
Feb 18 06:40:25 crc kubenswrapper[4869]: I0218 06:40:25.106730 4869 scope.go:117] "RemoveContainer" containerID="bc56eaaf9aed2f510e510e7970fe5e92d81c9ff7dbbd7c77e5e00fc2af00696d"
Feb 18 06:40:25 crc kubenswrapper[4869]: I0218 06:40:25.138598 4869 scope.go:117] "RemoveContainer" containerID="5397a1088327d4937caf461ceebff315cf712e073951a03257b46c3f1402b9af"
Feb 18 06:40:25 crc kubenswrapper[4869]: E0218 06:40:25.139052 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5397a1088327d4937caf461ceebff315cf712e073951a03257b46c3f1402b9af\": container with ID starting with 5397a1088327d4937caf461ceebff315cf712e073951a03257b46c3f1402b9af not found: ID does not exist" containerID="5397a1088327d4937caf461ceebff315cf712e073951a03257b46c3f1402b9af"
Feb 18 06:40:25 crc kubenswrapper[4869]: I0218 06:40:25.139092 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5397a1088327d4937caf461ceebff315cf712e073951a03257b46c3f1402b9af"} err="failed to get container status \"5397a1088327d4937caf461ceebff315cf712e073951a03257b46c3f1402b9af\": rpc error: code = NotFound desc = could not find container \"5397a1088327d4937caf461ceebff315cf712e073951a03257b46c3f1402b9af\": container with ID starting with 5397a1088327d4937caf461ceebff315cf712e073951a03257b46c3f1402b9af not found: ID does not exist"
Feb 18 06:40:25 crc kubenswrapper[4869]: I0218 06:40:25.139114 4869 scope.go:117] "RemoveContainer" containerID="bf49c999400002dca2c5b293f5fbde973dbbcdfb6371ece974f463fba66b867e"
Feb 18 06:40:25 crc kubenswrapper[4869]: E0218 06:40:25.139512 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf49c999400002dca2c5b293f5fbde973dbbcdfb6371ece974f463fba66b867e\": container with ID starting with bf49c999400002dca2c5b293f5fbde973dbbcdfb6371ece974f463fba66b867e not found: ID does not exist" containerID="bf49c999400002dca2c5b293f5fbde973dbbcdfb6371ece974f463fba66b867e"
Feb 18 06:40:25 crc kubenswrapper[4869]: I0218 06:40:25.139538 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf49c999400002dca2c5b293f5fbde973dbbcdfb6371ece974f463fba66b867e"} err="failed to get container status \"bf49c999400002dca2c5b293f5fbde973dbbcdfb6371ece974f463fba66b867e\": rpc error: code = NotFound desc = could not find container \"bf49c999400002dca2c5b293f5fbde973dbbcdfb6371ece974f463fba66b867e\": container with ID starting with bf49c999400002dca2c5b293f5fbde973dbbcdfb6371ece974f463fba66b867e not found: ID does not exist"
Feb 18 06:40:25 crc kubenswrapper[4869]: I0218 06:40:25.139554 4869 scope.go:117] "RemoveContainer" containerID="bc56eaaf9aed2f510e510e7970fe5e92d81c9ff7dbbd7c77e5e00fc2af00696d"
Feb 18 06:40:25 crc kubenswrapper[4869]: E0218 06:40:25.140120 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc56eaaf9aed2f510e510e7970fe5e92d81c9ff7dbbd7c77e5e00fc2af00696d\": container with ID starting with bc56eaaf9aed2f510e510e7970fe5e92d81c9ff7dbbd7c77e5e00fc2af00696d not found: ID does not exist" containerID="bc56eaaf9aed2f510e510e7970fe5e92d81c9ff7dbbd7c77e5e00fc2af00696d"
Feb 18 06:40:25 crc kubenswrapper[4869]: I0218 06:40:25.140180 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc56eaaf9aed2f510e510e7970fe5e92d81c9ff7dbbd7c77e5e00fc2af00696d"} err="failed to get container status \"bc56eaaf9aed2f510e510e7970fe5e92d81c9ff7dbbd7c77e5e00fc2af00696d\": rpc error: code = NotFound desc = could not find container \"bc56eaaf9aed2f510e510e7970fe5e92d81c9ff7dbbd7c77e5e00fc2af00696d\": container with ID starting with bc56eaaf9aed2f510e510e7970fe5e92d81c9ff7dbbd7c77e5e00fc2af00696d not found: ID does not exist"
Feb 18 06:40:25 crc kubenswrapper[4869]: I0218 06:40:25.480562 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d70f15b-ac2e-467c-b06e-8bbdf0055651" path="/var/lib/kubelet/pods/3d70f15b-ac2e-467c-b06e-8bbdf0055651/volumes"
Feb 18 06:40:26 crc kubenswrapper[4869]: I0218 06:40:26.054445 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t7vfx"]
Feb 18 06:40:26 crc kubenswrapper[4869]: I0218 06:40:26.058666 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t7vfx" podUID="65c6bb33-1df9-446b-8589-99c1bc159f0b" containerName="registry-server" containerID="cri-o://d6c37894396883bd99f5b1c3fe37808f66de1adaaf356b40be47c7b7df9d8d5f" gracePeriod=2
Feb 18 06:40:26 crc kubenswrapper[4869]: I0218 06:40:26.525521 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t7vfx"
Feb 18 06:40:26 crc kubenswrapper[4869]: I0218 06:40:26.715737 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z894x\" (UniqueName: \"kubernetes.io/projected/65c6bb33-1df9-446b-8589-99c1bc159f0b-kube-api-access-z894x\") pod \"65c6bb33-1df9-446b-8589-99c1bc159f0b\" (UID: \"65c6bb33-1df9-446b-8589-99c1bc159f0b\") "
Feb 18 06:40:26 crc kubenswrapper[4869]: I0218 06:40:26.716024 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65c6bb33-1df9-446b-8589-99c1bc159f0b-catalog-content\") pod \"65c6bb33-1df9-446b-8589-99c1bc159f0b\" (UID: \"65c6bb33-1df9-446b-8589-99c1bc159f0b\") "
Feb 18 06:40:26 crc kubenswrapper[4869]: I0218 06:40:26.716058 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65c6bb33-1df9-446b-8589-99c1bc159f0b-utilities\") pod \"65c6bb33-1df9-446b-8589-99c1bc159f0b\" (UID: \"65c6bb33-1df9-446b-8589-99c1bc159f0b\") "
Feb 18 06:40:26 crc kubenswrapper[4869]: I0218 06:40:26.716918 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65c6bb33-1df9-446b-8589-99c1bc159f0b-utilities" (OuterVolumeSpecName: "utilities") pod "65c6bb33-1df9-446b-8589-99c1bc159f0b" (UID: "65c6bb33-1df9-446b-8589-99c1bc159f0b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:40:26 crc kubenswrapper[4869]: I0218 06:40:26.721946 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65c6bb33-1df9-446b-8589-99c1bc159f0b-kube-api-access-z894x" (OuterVolumeSpecName: "kube-api-access-z894x") pod "65c6bb33-1df9-446b-8589-99c1bc159f0b" (UID: "65c6bb33-1df9-446b-8589-99c1bc159f0b"). InnerVolumeSpecName "kube-api-access-z894x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:40:26 crc kubenswrapper[4869]: I0218 06:40:26.788081 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65c6bb33-1df9-446b-8589-99c1bc159f0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65c6bb33-1df9-446b-8589-99c1bc159f0b" (UID: "65c6bb33-1df9-446b-8589-99c1bc159f0b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:40:26 crc kubenswrapper[4869]: I0218 06:40:26.818655 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65c6bb33-1df9-446b-8589-99c1bc159f0b-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 06:40:26 crc kubenswrapper[4869]: I0218 06:40:26.818692 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65c6bb33-1df9-446b-8589-99c1bc159f0b-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 06:40:26 crc kubenswrapper[4869]: I0218 06:40:26.818703 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z894x\" (UniqueName: \"kubernetes.io/projected/65c6bb33-1df9-446b-8589-99c1bc159f0b-kube-api-access-z894x\") on node \"crc\" DevicePath \"\""
Feb 18 06:40:27 crc kubenswrapper[4869]: I0218 06:40:27.068496 4869 generic.go:334] "Generic (PLEG): container finished" podID="65c6bb33-1df9-446b-8589-99c1bc159f0b" containerID="d6c37894396883bd99f5b1c3fe37808f66de1adaaf356b40be47c7b7df9d8d5f" exitCode=0
Feb 18 06:40:27 crc kubenswrapper[4869]: I0218 06:40:27.068546 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t7vfx" event={"ID":"65c6bb33-1df9-446b-8589-99c1bc159f0b","Type":"ContainerDied","Data":"d6c37894396883bd99f5b1c3fe37808f66de1adaaf356b40be47c7b7df9d8d5f"}
Feb 18 06:40:27 crc kubenswrapper[4869]: I0218 06:40:27.068551 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t7vfx"
Feb 18 06:40:27 crc kubenswrapper[4869]: I0218 06:40:27.068578 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t7vfx" event={"ID":"65c6bb33-1df9-446b-8589-99c1bc159f0b","Type":"ContainerDied","Data":"c5cfd4708570a610054917a258599b0bfe081db034ee27280c03329b6741a5e2"}
Feb 18 06:40:27 crc kubenswrapper[4869]: I0218 06:40:27.068598 4869 scope.go:117] "RemoveContainer" containerID="d6c37894396883bd99f5b1c3fe37808f66de1adaaf356b40be47c7b7df9d8d5f"
Feb 18 06:40:27 crc kubenswrapper[4869]: I0218 06:40:27.094726 4869 scope.go:117] "RemoveContainer" containerID="b77140a7d5fb6824d6821f00a1fba4e90616504dd505494b3db5c97bd3f34220"
Feb 18 06:40:27 crc kubenswrapper[4869]: I0218 06:40:27.102130 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t7vfx"]
Feb 18 06:40:27 crc kubenswrapper[4869]: I0218 06:40:27.111194 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t7vfx"]
Feb 18 06:40:27 crc kubenswrapper[4869]: I0218 06:40:27.134433 4869 scope.go:117] "RemoveContainer" containerID="d50ee5f43114a46d680a9585f3afa1cb8e7adcde2b5a693b5b5bc721e3e604bc"
Feb 18 06:40:27 crc kubenswrapper[4869]: I0218 06:40:27.160097 4869 scope.go:117] "RemoveContainer" containerID="d6c37894396883bd99f5b1c3fe37808f66de1adaaf356b40be47c7b7df9d8d5f"
Feb 18 06:40:27 crc kubenswrapper[4869]: E0218 06:40:27.160518 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6c37894396883bd99f5b1c3fe37808f66de1adaaf356b40be47c7b7df9d8d5f\": container with ID starting with d6c37894396883bd99f5b1c3fe37808f66de1adaaf356b40be47c7b7df9d8d5f not found: ID does not exist" containerID="d6c37894396883bd99f5b1c3fe37808f66de1adaaf356b40be47c7b7df9d8d5f"
Feb 18 06:40:27 crc kubenswrapper[4869]: I0218 06:40:27.160550 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6c37894396883bd99f5b1c3fe37808f66de1adaaf356b40be47c7b7df9d8d5f"} err="failed to get container status \"d6c37894396883bd99f5b1c3fe37808f66de1adaaf356b40be47c7b7df9d8d5f\": rpc error: code = NotFound desc = could not find container \"d6c37894396883bd99f5b1c3fe37808f66de1adaaf356b40be47c7b7df9d8d5f\": container with ID starting with d6c37894396883bd99f5b1c3fe37808f66de1adaaf356b40be47c7b7df9d8d5f not found: ID does not exist"
Feb 18 06:40:27 crc kubenswrapper[4869]: I0218 06:40:27.160573 4869 scope.go:117] "RemoveContainer" containerID="b77140a7d5fb6824d6821f00a1fba4e90616504dd505494b3db5c97bd3f34220"
Feb 18 06:40:27 crc kubenswrapper[4869]: E0218 06:40:27.160905 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b77140a7d5fb6824d6821f00a1fba4e90616504dd505494b3db5c97bd3f34220\": container with ID starting with b77140a7d5fb6824d6821f00a1fba4e90616504dd505494b3db5c97bd3f34220 not found: ID does not exist" containerID="b77140a7d5fb6824d6821f00a1fba4e90616504dd505494b3db5c97bd3f34220"
Feb 18 06:40:27 crc kubenswrapper[4869]: I0218 06:40:27.160954 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b77140a7d5fb6824d6821f00a1fba4e90616504dd505494b3db5c97bd3f34220"} err="failed to get container status \"b77140a7d5fb6824d6821f00a1fba4e90616504dd505494b3db5c97bd3f34220\": rpc error: code = NotFound desc = could not find container \"b77140a7d5fb6824d6821f00a1fba4e90616504dd505494b3db5c97bd3f34220\": container with ID starting with b77140a7d5fb6824d6821f00a1fba4e90616504dd505494b3db5c97bd3f34220 not found: ID does not exist"
Feb 18 06:40:27 crc kubenswrapper[4869]: I0218 06:40:27.160985 4869 scope.go:117] "RemoveContainer" containerID="d50ee5f43114a46d680a9585f3afa1cb8e7adcde2b5a693b5b5bc721e3e604bc"
Feb 18 06:40:27 crc kubenswrapper[4869]: E0218 06:40:27.161353 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d50ee5f43114a46d680a9585f3afa1cb8e7adcde2b5a693b5b5bc721e3e604bc\": container with ID starting with d50ee5f43114a46d680a9585f3afa1cb8e7adcde2b5a693b5b5bc721e3e604bc not found: ID does not exist" containerID="d50ee5f43114a46d680a9585f3afa1cb8e7adcde2b5a693b5b5bc721e3e604bc"
Feb 18 06:40:27 crc kubenswrapper[4869]: I0218 06:40:27.161385 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d50ee5f43114a46d680a9585f3afa1cb8e7adcde2b5a693b5b5bc721e3e604bc"} err="failed to get container status \"d50ee5f43114a46d680a9585f3afa1cb8e7adcde2b5a693b5b5bc721e3e604bc\": rpc error: code = NotFound desc = could not find container \"d50ee5f43114a46d680a9585f3afa1cb8e7adcde2b5a693b5b5bc721e3e604bc\": container with ID starting with d50ee5f43114a46d680a9585f3afa1cb8e7adcde2b5a693b5b5bc721e3e604bc not found: ID does not exist"
Feb 18 06:40:27 crc kubenswrapper[4869]: I0218 06:40:27.479527 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65c6bb33-1df9-446b-8589-99c1bc159f0b" path="/var/lib/kubelet/pods/65c6bb33-1df9-446b-8589-99c1bc159f0b/volumes"
Feb 18 06:40:35 crc kubenswrapper[4869]: I0218 06:40:35.470343 4869 scope.go:117] "RemoveContainer" containerID="23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6"
Feb 18 06:40:35 crc kubenswrapper[4869]: E0218 06:40:35.471998 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d"
Feb 18 06:40:46 crc kubenswrapper[4869]: I0218 06:40:46.470786 
4869 scope.go:117] "RemoveContainer" containerID="23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6" Feb 18 06:40:46 crc kubenswrapper[4869]: E0218 06:40:46.471547 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:41:01 crc kubenswrapper[4869]: I0218 06:41:01.470916 4869 scope.go:117] "RemoveContainer" containerID="23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6" Feb 18 06:41:01 crc kubenswrapper[4869]: E0218 06:41:01.471677 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:41:15 crc kubenswrapper[4869]: I0218 06:41:15.487778 4869 scope.go:117] "RemoveContainer" containerID="23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6" Feb 18 06:41:15 crc kubenswrapper[4869]: E0218 06:41:15.488847 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:41:22 crc kubenswrapper[4869]: I0218 
06:41:22.955019 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lf7kh"] Feb 18 06:41:22 crc kubenswrapper[4869]: E0218 06:41:22.956069 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d70f15b-ac2e-467c-b06e-8bbdf0055651" containerName="extract-utilities" Feb 18 06:41:22 crc kubenswrapper[4869]: I0218 06:41:22.956088 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d70f15b-ac2e-467c-b06e-8bbdf0055651" containerName="extract-utilities" Feb 18 06:41:22 crc kubenswrapper[4869]: E0218 06:41:22.956108 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65c6bb33-1df9-446b-8589-99c1bc159f0b" containerName="extract-utilities" Feb 18 06:41:22 crc kubenswrapper[4869]: I0218 06:41:22.956117 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="65c6bb33-1df9-446b-8589-99c1bc159f0b" containerName="extract-utilities" Feb 18 06:41:22 crc kubenswrapper[4869]: E0218 06:41:22.956129 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65c6bb33-1df9-446b-8589-99c1bc159f0b" containerName="registry-server" Feb 18 06:41:22 crc kubenswrapper[4869]: I0218 06:41:22.956136 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="65c6bb33-1df9-446b-8589-99c1bc159f0b" containerName="registry-server" Feb 18 06:41:22 crc kubenswrapper[4869]: E0218 06:41:22.956154 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65c6bb33-1df9-446b-8589-99c1bc159f0b" containerName="extract-content" Feb 18 06:41:22 crc kubenswrapper[4869]: I0218 06:41:22.956161 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="65c6bb33-1df9-446b-8589-99c1bc159f0b" containerName="extract-content" Feb 18 06:41:22 crc kubenswrapper[4869]: E0218 06:41:22.956179 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d70f15b-ac2e-467c-b06e-8bbdf0055651" containerName="extract-content" Feb 18 06:41:22 crc kubenswrapper[4869]: I0218 06:41:22.956185 4869 
state_mem.go:107] "Deleted CPUSet assignment" podUID="3d70f15b-ac2e-467c-b06e-8bbdf0055651" containerName="extract-content" Feb 18 06:41:22 crc kubenswrapper[4869]: E0218 06:41:22.956206 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d70f15b-ac2e-467c-b06e-8bbdf0055651" containerName="registry-server" Feb 18 06:41:22 crc kubenswrapper[4869]: I0218 06:41:22.956213 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d70f15b-ac2e-467c-b06e-8bbdf0055651" containerName="registry-server" Feb 18 06:41:22 crc kubenswrapper[4869]: I0218 06:41:22.956437 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="65c6bb33-1df9-446b-8589-99c1bc159f0b" containerName="registry-server" Feb 18 06:41:22 crc kubenswrapper[4869]: I0218 06:41:22.956455 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d70f15b-ac2e-467c-b06e-8bbdf0055651" containerName="registry-server" Feb 18 06:41:22 crc kubenswrapper[4869]: I0218 06:41:22.958496 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lf7kh" Feb 18 06:41:22 crc kubenswrapper[4869]: I0218 06:41:22.966659 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lf7kh"] Feb 18 06:41:23 crc kubenswrapper[4869]: I0218 06:41:23.041980 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvttc\" (UniqueName: \"kubernetes.io/projected/22cb115d-afbf-4ee2-ad2f-59ebd862b386-kube-api-access-pvttc\") pod \"redhat-marketplace-lf7kh\" (UID: \"22cb115d-afbf-4ee2-ad2f-59ebd862b386\") " pod="openshift-marketplace/redhat-marketplace-lf7kh" Feb 18 06:41:23 crc kubenswrapper[4869]: I0218 06:41:23.042033 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22cb115d-afbf-4ee2-ad2f-59ebd862b386-utilities\") pod \"redhat-marketplace-lf7kh\" (UID: \"22cb115d-afbf-4ee2-ad2f-59ebd862b386\") " pod="openshift-marketplace/redhat-marketplace-lf7kh" Feb 18 06:41:23 crc kubenswrapper[4869]: I0218 06:41:23.042067 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22cb115d-afbf-4ee2-ad2f-59ebd862b386-catalog-content\") pod \"redhat-marketplace-lf7kh\" (UID: \"22cb115d-afbf-4ee2-ad2f-59ebd862b386\") " pod="openshift-marketplace/redhat-marketplace-lf7kh" Feb 18 06:41:23 crc kubenswrapper[4869]: I0218 06:41:23.143720 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvttc\" (UniqueName: \"kubernetes.io/projected/22cb115d-afbf-4ee2-ad2f-59ebd862b386-kube-api-access-pvttc\") pod \"redhat-marketplace-lf7kh\" (UID: \"22cb115d-afbf-4ee2-ad2f-59ebd862b386\") " pod="openshift-marketplace/redhat-marketplace-lf7kh" Feb 18 06:41:23 crc kubenswrapper[4869]: I0218 06:41:23.143784 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22cb115d-afbf-4ee2-ad2f-59ebd862b386-utilities\") pod \"redhat-marketplace-lf7kh\" (UID: \"22cb115d-afbf-4ee2-ad2f-59ebd862b386\") " pod="openshift-marketplace/redhat-marketplace-lf7kh" Feb 18 06:41:23 crc kubenswrapper[4869]: I0218 06:41:23.143821 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22cb115d-afbf-4ee2-ad2f-59ebd862b386-catalog-content\") pod \"redhat-marketplace-lf7kh\" (UID: \"22cb115d-afbf-4ee2-ad2f-59ebd862b386\") " pod="openshift-marketplace/redhat-marketplace-lf7kh" Feb 18 06:41:23 crc kubenswrapper[4869]: I0218 06:41:23.144246 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22cb115d-afbf-4ee2-ad2f-59ebd862b386-catalog-content\") pod \"redhat-marketplace-lf7kh\" (UID: \"22cb115d-afbf-4ee2-ad2f-59ebd862b386\") " pod="openshift-marketplace/redhat-marketplace-lf7kh" Feb 18 06:41:23 crc kubenswrapper[4869]: I0218 06:41:23.144361 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22cb115d-afbf-4ee2-ad2f-59ebd862b386-utilities\") pod \"redhat-marketplace-lf7kh\" (UID: \"22cb115d-afbf-4ee2-ad2f-59ebd862b386\") " pod="openshift-marketplace/redhat-marketplace-lf7kh" Feb 18 06:41:23 crc kubenswrapper[4869]: I0218 06:41:23.152566 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wvxdd"] Feb 18 06:41:23 crc kubenswrapper[4869]: I0218 06:41:23.158994 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wvxdd" Feb 18 06:41:23 crc kubenswrapper[4869]: I0218 06:41:23.172533 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wvxdd"] Feb 18 06:41:23 crc kubenswrapper[4869]: I0218 06:41:23.206043 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvttc\" (UniqueName: \"kubernetes.io/projected/22cb115d-afbf-4ee2-ad2f-59ebd862b386-kube-api-access-pvttc\") pod \"redhat-marketplace-lf7kh\" (UID: \"22cb115d-afbf-4ee2-ad2f-59ebd862b386\") " pod="openshift-marketplace/redhat-marketplace-lf7kh" Feb 18 06:41:23 crc kubenswrapper[4869]: I0218 06:41:23.246086 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6kt4\" (UniqueName: \"kubernetes.io/projected/1d58e670-3a92-4f04-8a1b-ace037634fea-kube-api-access-r6kt4\") pod \"redhat-operators-wvxdd\" (UID: \"1d58e670-3a92-4f04-8a1b-ace037634fea\") " pod="openshift-marketplace/redhat-operators-wvxdd" Feb 18 06:41:23 crc kubenswrapper[4869]: I0218 06:41:23.246280 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d58e670-3a92-4f04-8a1b-ace037634fea-catalog-content\") pod \"redhat-operators-wvxdd\" (UID: \"1d58e670-3a92-4f04-8a1b-ace037634fea\") " pod="openshift-marketplace/redhat-operators-wvxdd" Feb 18 06:41:23 crc kubenswrapper[4869]: I0218 06:41:23.246358 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d58e670-3a92-4f04-8a1b-ace037634fea-utilities\") pod \"redhat-operators-wvxdd\" (UID: \"1d58e670-3a92-4f04-8a1b-ace037634fea\") " pod="openshift-marketplace/redhat-operators-wvxdd" Feb 18 06:41:23 crc kubenswrapper[4869]: I0218 06:41:23.282671 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lf7kh" Feb 18 06:41:23 crc kubenswrapper[4869]: I0218 06:41:23.348391 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d58e670-3a92-4f04-8a1b-ace037634fea-catalog-content\") pod \"redhat-operators-wvxdd\" (UID: \"1d58e670-3a92-4f04-8a1b-ace037634fea\") " pod="openshift-marketplace/redhat-operators-wvxdd" Feb 18 06:41:23 crc kubenswrapper[4869]: I0218 06:41:23.348800 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d58e670-3a92-4f04-8a1b-ace037634fea-utilities\") pod \"redhat-operators-wvxdd\" (UID: \"1d58e670-3a92-4f04-8a1b-ace037634fea\") " pod="openshift-marketplace/redhat-operators-wvxdd" Feb 18 06:41:23 crc kubenswrapper[4869]: I0218 06:41:23.348863 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6kt4\" (UniqueName: \"kubernetes.io/projected/1d58e670-3a92-4f04-8a1b-ace037634fea-kube-api-access-r6kt4\") pod \"redhat-operators-wvxdd\" (UID: \"1d58e670-3a92-4f04-8a1b-ace037634fea\") " pod="openshift-marketplace/redhat-operators-wvxdd" Feb 18 06:41:23 crc kubenswrapper[4869]: I0218 06:41:23.349197 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d58e670-3a92-4f04-8a1b-ace037634fea-catalog-content\") pod \"redhat-operators-wvxdd\" (UID: \"1d58e670-3a92-4f04-8a1b-ace037634fea\") " pod="openshift-marketplace/redhat-operators-wvxdd" Feb 18 06:41:23 crc kubenswrapper[4869]: I0218 06:41:23.349493 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d58e670-3a92-4f04-8a1b-ace037634fea-utilities\") pod \"redhat-operators-wvxdd\" (UID: \"1d58e670-3a92-4f04-8a1b-ace037634fea\") " 
pod="openshift-marketplace/redhat-operators-wvxdd" Feb 18 06:41:23 crc kubenswrapper[4869]: I0218 06:41:23.385538 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6kt4\" (UniqueName: \"kubernetes.io/projected/1d58e670-3a92-4f04-8a1b-ace037634fea-kube-api-access-r6kt4\") pod \"redhat-operators-wvxdd\" (UID: \"1d58e670-3a92-4f04-8a1b-ace037634fea\") " pod="openshift-marketplace/redhat-operators-wvxdd" Feb 18 06:41:23 crc kubenswrapper[4869]: I0218 06:41:23.575204 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvxdd" Feb 18 06:41:23 crc kubenswrapper[4869]: I0218 06:41:23.810367 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lf7kh"] Feb 18 06:41:24 crc kubenswrapper[4869]: I0218 06:41:24.058509 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wvxdd"] Feb 18 06:41:24 crc kubenswrapper[4869]: W0218 06:41:24.062520 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d58e670_3a92_4f04_8a1b_ace037634fea.slice/crio-adb23754bde2a18e705dc2576c587cda33d8af85c374d017b0a9a6db99de73a5 WatchSource:0}: Error finding container adb23754bde2a18e705dc2576c587cda33d8af85c374d017b0a9a6db99de73a5: Status 404 returned error can't find the container with id adb23754bde2a18e705dc2576c587cda33d8af85c374d017b0a9a6db99de73a5 Feb 18 06:41:24 crc kubenswrapper[4869]: I0218 06:41:24.596125 4869 generic.go:334] "Generic (PLEG): container finished" podID="22cb115d-afbf-4ee2-ad2f-59ebd862b386" containerID="4bfdefd4595547511bb5f7499f7ba7cb88c816b61e4d0ab4dd0835b1c262a5be" exitCode=0 Feb 18 06:41:24 crc kubenswrapper[4869]: I0218 06:41:24.596182 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lf7kh" 
event={"ID":"22cb115d-afbf-4ee2-ad2f-59ebd862b386","Type":"ContainerDied","Data":"4bfdefd4595547511bb5f7499f7ba7cb88c816b61e4d0ab4dd0835b1c262a5be"} Feb 18 06:41:24 crc kubenswrapper[4869]: I0218 06:41:24.596241 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lf7kh" event={"ID":"22cb115d-afbf-4ee2-ad2f-59ebd862b386","Type":"ContainerStarted","Data":"def4cdcc175f9a1e9452b8340cc674fafcbbf535d4b3d1c927e030c9160f0cbf"} Feb 18 06:41:24 crc kubenswrapper[4869]: I0218 06:41:24.599421 4869 generic.go:334] "Generic (PLEG): container finished" podID="1d58e670-3a92-4f04-8a1b-ace037634fea" containerID="0cf43e32780714d4e7b0d8a6b00db3a406a886da71f0170561f12abf16aeb0cf" exitCode=0 Feb 18 06:41:24 crc kubenswrapper[4869]: I0218 06:41:24.599464 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvxdd" event={"ID":"1d58e670-3a92-4f04-8a1b-ace037634fea","Type":"ContainerDied","Data":"0cf43e32780714d4e7b0d8a6b00db3a406a886da71f0170561f12abf16aeb0cf"} Feb 18 06:41:24 crc kubenswrapper[4869]: I0218 06:41:24.599490 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvxdd" event={"ID":"1d58e670-3a92-4f04-8a1b-ace037634fea","Type":"ContainerStarted","Data":"adb23754bde2a18e705dc2576c587cda33d8af85c374d017b0a9a6db99de73a5"} Feb 18 06:41:25 crc kubenswrapper[4869]: I0218 06:41:25.610827 4869 generic.go:334] "Generic (PLEG): container finished" podID="22cb115d-afbf-4ee2-ad2f-59ebd862b386" containerID="de624c6744d2ead6eed23a74c37bbe90ca0b3b2313fefdb4a4f51472035c3f4d" exitCode=0 Feb 18 06:41:25 crc kubenswrapper[4869]: I0218 06:41:25.611176 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lf7kh" event={"ID":"22cb115d-afbf-4ee2-ad2f-59ebd862b386","Type":"ContainerDied","Data":"de624c6744d2ead6eed23a74c37bbe90ca0b3b2313fefdb4a4f51472035c3f4d"} Feb 18 06:41:25 crc kubenswrapper[4869]: I0218 
06:41:25.617711 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvxdd" event={"ID":"1d58e670-3a92-4f04-8a1b-ace037634fea","Type":"ContainerStarted","Data":"d2947d64bd39749a133f1b050460ead53b661f38ea33a9e9fcbbe48cf0a9debb"} Feb 18 06:41:26 crc kubenswrapper[4869]: I0218 06:41:26.627320 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lf7kh" event={"ID":"22cb115d-afbf-4ee2-ad2f-59ebd862b386","Type":"ContainerStarted","Data":"3c7e08f34eb086b489cee8dde465dd4331e25387842b0843536af050cbf463d0"} Feb 18 06:41:26 crc kubenswrapper[4869]: I0218 06:41:26.648660 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lf7kh" podStartSLOduration=3.2812116270000002 podStartE2EDuration="4.648641063s" podCreationTimestamp="2026-02-18 06:41:22 +0000 UTC" firstStartedPulling="2026-02-18 06:41:24.598144915 +0000 UTC m=+3181.767233147" lastFinishedPulling="2026-02-18 06:41:25.965574351 +0000 UTC m=+3183.134662583" observedRunningTime="2026-02-18 06:41:26.642926776 +0000 UTC m=+3183.812015008" watchObservedRunningTime="2026-02-18 06:41:26.648641063 +0000 UTC m=+3183.817729295" Feb 18 06:41:29 crc kubenswrapper[4869]: I0218 06:41:29.653934 4869 generic.go:334] "Generic (PLEG): container finished" podID="1d58e670-3a92-4f04-8a1b-ace037634fea" containerID="d2947d64bd39749a133f1b050460ead53b661f38ea33a9e9fcbbe48cf0a9debb" exitCode=0 Feb 18 06:41:29 crc kubenswrapper[4869]: I0218 06:41:29.654139 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvxdd" event={"ID":"1d58e670-3a92-4f04-8a1b-ace037634fea","Type":"ContainerDied","Data":"d2947d64bd39749a133f1b050460ead53b661f38ea33a9e9fcbbe48cf0a9debb"} Feb 18 06:41:30 crc kubenswrapper[4869]: I0218 06:41:30.469757 4869 scope.go:117] "RemoveContainer" containerID="23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6" Feb 18 
06:41:30 crc kubenswrapper[4869]: E0218 06:41:30.470340 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:41:30 crc kubenswrapper[4869]: I0218 06:41:30.663774 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvxdd" event={"ID":"1d58e670-3a92-4f04-8a1b-ace037634fea","Type":"ContainerStarted","Data":"bb604f6dfbac873809d800450710c0f8d88001be1783a8351f3c7acc30791b54"} Feb 18 06:41:30 crc kubenswrapper[4869]: I0218 06:41:30.691068 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wvxdd" podStartSLOduration=2.151611665 podStartE2EDuration="7.691049271s" podCreationTimestamp="2026-02-18 06:41:23 +0000 UTC" firstStartedPulling="2026-02-18 06:41:24.600895761 +0000 UTC m=+3181.769983993" lastFinishedPulling="2026-02-18 06:41:30.140333367 +0000 UTC m=+3187.309421599" observedRunningTime="2026-02-18 06:41:30.682201309 +0000 UTC m=+3187.851289541" watchObservedRunningTime="2026-02-18 06:41:30.691049271 +0000 UTC m=+3187.860137503" Feb 18 06:41:33 crc kubenswrapper[4869]: I0218 06:41:33.283090 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lf7kh" Feb 18 06:41:33 crc kubenswrapper[4869]: I0218 06:41:33.283780 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lf7kh" Feb 18 06:41:33 crc kubenswrapper[4869]: I0218 06:41:33.347205 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-lf7kh" Feb 18 06:41:33 crc kubenswrapper[4869]: I0218 06:41:33.575310 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wvxdd" Feb 18 06:41:33 crc kubenswrapper[4869]: I0218 06:41:33.575375 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wvxdd" Feb 18 06:41:33 crc kubenswrapper[4869]: I0218 06:41:33.749926 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lf7kh" Feb 18 06:41:34 crc kubenswrapper[4869]: I0218 06:41:34.629510 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wvxdd" podUID="1d58e670-3a92-4f04-8a1b-ace037634fea" containerName="registry-server" probeResult="failure" output=< Feb 18 06:41:34 crc kubenswrapper[4869]: timeout: failed to connect service ":50051" within 1s Feb 18 06:41:34 crc kubenswrapper[4869]: > Feb 18 06:41:36 crc kubenswrapper[4869]: I0218 06:41:36.349622 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lf7kh"] Feb 18 06:41:36 crc kubenswrapper[4869]: I0218 06:41:36.718828 4869 generic.go:334] "Generic (PLEG): container finished" podID="4450687d-212c-4577-9511-05a7f072b274" containerID="f558503deb74af893a9669958490c6a8cbb416c1d93dd1fdfac0ccbc26bb3770" exitCode=0 Feb 18 06:41:36 crc kubenswrapper[4869]: I0218 06:41:36.719111 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lf7kh" podUID="22cb115d-afbf-4ee2-ad2f-59ebd862b386" containerName="registry-server" containerID="cri-o://3c7e08f34eb086b489cee8dde465dd4331e25387842b0843536af050cbf463d0" gracePeriod=2 Feb 18 06:41:36 crc kubenswrapper[4869]: I0218 06:41:36.719199 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" 
event={"ID":"4450687d-212c-4577-9511-05a7f072b274","Type":"ContainerDied","Data":"f558503deb74af893a9669958490c6a8cbb416c1d93dd1fdfac0ccbc26bb3770"} Feb 18 06:41:37 crc kubenswrapper[4869]: I0218 06:41:37.287598 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lf7kh" Feb 18 06:41:37 crc kubenswrapper[4869]: I0218 06:41:37.352375 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22cb115d-afbf-4ee2-ad2f-59ebd862b386-catalog-content\") pod \"22cb115d-afbf-4ee2-ad2f-59ebd862b386\" (UID: \"22cb115d-afbf-4ee2-ad2f-59ebd862b386\") " Feb 18 06:41:37 crc kubenswrapper[4869]: I0218 06:41:37.352575 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22cb115d-afbf-4ee2-ad2f-59ebd862b386-utilities\") pod \"22cb115d-afbf-4ee2-ad2f-59ebd862b386\" (UID: \"22cb115d-afbf-4ee2-ad2f-59ebd862b386\") " Feb 18 06:41:37 crc kubenswrapper[4869]: I0218 06:41:37.353595 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22cb115d-afbf-4ee2-ad2f-59ebd862b386-utilities" (OuterVolumeSpecName: "utilities") pod "22cb115d-afbf-4ee2-ad2f-59ebd862b386" (UID: "22cb115d-afbf-4ee2-ad2f-59ebd862b386"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:41:37 crc kubenswrapper[4869]: I0218 06:41:37.353698 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvttc\" (UniqueName: \"kubernetes.io/projected/22cb115d-afbf-4ee2-ad2f-59ebd862b386-kube-api-access-pvttc\") pod \"22cb115d-afbf-4ee2-ad2f-59ebd862b386\" (UID: \"22cb115d-afbf-4ee2-ad2f-59ebd862b386\") " Feb 18 06:41:37 crc kubenswrapper[4869]: I0218 06:41:37.354873 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22cb115d-afbf-4ee2-ad2f-59ebd862b386-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:41:37 crc kubenswrapper[4869]: I0218 06:41:37.359803 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22cb115d-afbf-4ee2-ad2f-59ebd862b386-kube-api-access-pvttc" (OuterVolumeSpecName: "kube-api-access-pvttc") pod "22cb115d-afbf-4ee2-ad2f-59ebd862b386" (UID: "22cb115d-afbf-4ee2-ad2f-59ebd862b386"). InnerVolumeSpecName "kube-api-access-pvttc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:41:37 crc kubenswrapper[4869]: I0218 06:41:37.381889 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22cb115d-afbf-4ee2-ad2f-59ebd862b386-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22cb115d-afbf-4ee2-ad2f-59ebd862b386" (UID: "22cb115d-afbf-4ee2-ad2f-59ebd862b386"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:41:37 crc kubenswrapper[4869]: I0218 06:41:37.456563 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22cb115d-afbf-4ee2-ad2f-59ebd862b386-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:41:37 crc kubenswrapper[4869]: I0218 06:41:37.456609 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvttc\" (UniqueName: \"kubernetes.io/projected/22cb115d-afbf-4ee2-ad2f-59ebd862b386-kube-api-access-pvttc\") on node \"crc\" DevicePath \"\"" Feb 18 06:41:37 crc kubenswrapper[4869]: I0218 06:41:37.728799 4869 generic.go:334] "Generic (PLEG): container finished" podID="22cb115d-afbf-4ee2-ad2f-59ebd862b386" containerID="3c7e08f34eb086b489cee8dde465dd4331e25387842b0843536af050cbf463d0" exitCode=0 Feb 18 06:41:37 crc kubenswrapper[4869]: I0218 06:41:37.728882 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lf7kh" Feb 18 06:41:37 crc kubenswrapper[4869]: I0218 06:41:37.728890 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lf7kh" event={"ID":"22cb115d-afbf-4ee2-ad2f-59ebd862b386","Type":"ContainerDied","Data":"3c7e08f34eb086b489cee8dde465dd4331e25387842b0843536af050cbf463d0"} Feb 18 06:41:37 crc kubenswrapper[4869]: I0218 06:41:37.728927 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lf7kh" event={"ID":"22cb115d-afbf-4ee2-ad2f-59ebd862b386","Type":"ContainerDied","Data":"def4cdcc175f9a1e9452b8340cc674fafcbbf535d4b3d1c927e030c9160f0cbf"} Feb 18 06:41:37 crc kubenswrapper[4869]: I0218 06:41:37.728947 4869 scope.go:117] "RemoveContainer" containerID="3c7e08f34eb086b489cee8dde465dd4331e25387842b0843536af050cbf463d0" Feb 18 06:41:37 crc kubenswrapper[4869]: I0218 06:41:37.760866 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-lf7kh"] Feb 18 06:41:37 crc kubenswrapper[4869]: I0218 06:41:37.763938 4869 scope.go:117] "RemoveContainer" containerID="de624c6744d2ead6eed23a74c37bbe90ca0b3b2313fefdb4a4f51472035c3f4d" Feb 18 06:41:37 crc kubenswrapper[4869]: I0218 06:41:37.770141 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lf7kh"] Feb 18 06:41:37 crc kubenswrapper[4869]: I0218 06:41:37.783270 4869 scope.go:117] "RemoveContainer" containerID="4bfdefd4595547511bb5f7499f7ba7cb88c816b61e4d0ab4dd0835b1c262a5be" Feb 18 06:41:37 crc kubenswrapper[4869]: I0218 06:41:37.826104 4869 scope.go:117] "RemoveContainer" containerID="3c7e08f34eb086b489cee8dde465dd4331e25387842b0843536af050cbf463d0" Feb 18 06:41:37 crc kubenswrapper[4869]: E0218 06:41:37.828380 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c7e08f34eb086b489cee8dde465dd4331e25387842b0843536af050cbf463d0\": container with ID starting with 3c7e08f34eb086b489cee8dde465dd4331e25387842b0843536af050cbf463d0 not found: ID does not exist" containerID="3c7e08f34eb086b489cee8dde465dd4331e25387842b0843536af050cbf463d0" Feb 18 06:41:37 crc kubenswrapper[4869]: I0218 06:41:37.828426 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c7e08f34eb086b489cee8dde465dd4331e25387842b0843536af050cbf463d0"} err="failed to get container status \"3c7e08f34eb086b489cee8dde465dd4331e25387842b0843536af050cbf463d0\": rpc error: code = NotFound desc = could not find container \"3c7e08f34eb086b489cee8dde465dd4331e25387842b0843536af050cbf463d0\": container with ID starting with 3c7e08f34eb086b489cee8dde465dd4331e25387842b0843536af050cbf463d0 not found: ID does not exist" Feb 18 06:41:37 crc kubenswrapper[4869]: I0218 06:41:37.828467 4869 scope.go:117] "RemoveContainer" 
containerID="de624c6744d2ead6eed23a74c37bbe90ca0b3b2313fefdb4a4f51472035c3f4d" Feb 18 06:41:37 crc kubenswrapper[4869]: E0218 06:41:37.829016 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de624c6744d2ead6eed23a74c37bbe90ca0b3b2313fefdb4a4f51472035c3f4d\": container with ID starting with de624c6744d2ead6eed23a74c37bbe90ca0b3b2313fefdb4a4f51472035c3f4d not found: ID does not exist" containerID="de624c6744d2ead6eed23a74c37bbe90ca0b3b2313fefdb4a4f51472035c3f4d" Feb 18 06:41:37 crc kubenswrapper[4869]: I0218 06:41:37.829050 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de624c6744d2ead6eed23a74c37bbe90ca0b3b2313fefdb4a4f51472035c3f4d"} err="failed to get container status \"de624c6744d2ead6eed23a74c37bbe90ca0b3b2313fefdb4a4f51472035c3f4d\": rpc error: code = NotFound desc = could not find container \"de624c6744d2ead6eed23a74c37bbe90ca0b3b2313fefdb4a4f51472035c3f4d\": container with ID starting with de624c6744d2ead6eed23a74c37bbe90ca0b3b2313fefdb4a4f51472035c3f4d not found: ID does not exist" Feb 18 06:41:37 crc kubenswrapper[4869]: I0218 06:41:37.829076 4869 scope.go:117] "RemoveContainer" containerID="4bfdefd4595547511bb5f7499f7ba7cb88c816b61e4d0ab4dd0835b1c262a5be" Feb 18 06:41:37 crc kubenswrapper[4869]: E0218 06:41:37.829350 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bfdefd4595547511bb5f7499f7ba7cb88c816b61e4d0ab4dd0835b1c262a5be\": container with ID starting with 4bfdefd4595547511bb5f7499f7ba7cb88c816b61e4d0ab4dd0835b1c262a5be not found: ID does not exist" containerID="4bfdefd4595547511bb5f7499f7ba7cb88c816b61e4d0ab4dd0835b1c262a5be" Feb 18 06:41:37 crc kubenswrapper[4869]: I0218 06:41:37.829371 4869 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4bfdefd4595547511bb5f7499f7ba7cb88c816b61e4d0ab4dd0835b1c262a5be"} err="failed to get container status \"4bfdefd4595547511bb5f7499f7ba7cb88c816b61e4d0ab4dd0835b1c262a5be\": rpc error: code = NotFound desc = could not find container \"4bfdefd4595547511bb5f7499f7ba7cb88c816b61e4d0ab4dd0835b1c262a5be\": container with ID starting with 4bfdefd4595547511bb5f7499f7ba7cb88c816b61e4d0ab4dd0835b1c262a5be not found: ID does not exist" Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.089971 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.167249 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl2k5\" (UniqueName: \"kubernetes.io/projected/4450687d-212c-4577-9511-05a7f072b274-kube-api-access-sl2k5\") pod \"4450687d-212c-4577-9511-05a7f072b274\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.167517 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4450687d-212c-4577-9511-05a7f072b274-openstack-config\") pod \"4450687d-212c-4577-9511-05a7f072b274\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.167596 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4450687d-212c-4577-9511-05a7f072b274-ca-certs\") pod \"4450687d-212c-4577-9511-05a7f072b274\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.167709 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/4450687d-212c-4577-9511-05a7f072b274-test-operator-ephemeral-temporary\") pod \"4450687d-212c-4577-9511-05a7f072b274\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.168169 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4450687d-212c-4577-9511-05a7f072b274-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "4450687d-212c-4577-9511-05a7f072b274" (UID: "4450687d-212c-4577-9511-05a7f072b274"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.168373 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4450687d-212c-4577-9511-05a7f072b274-test-operator-ephemeral-workdir\") pod \"4450687d-212c-4577-9511-05a7f072b274\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.168585 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4450687d-212c-4577-9511-05a7f072b274-config-data\") pod \"4450687d-212c-4577-9511-05a7f072b274\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.168663 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4450687d-212c-4577-9511-05a7f072b274-ssh-key\") pod \"4450687d-212c-4577-9511-05a7f072b274\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.168800 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"4450687d-212c-4577-9511-05a7f072b274\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.172758 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4450687d-212c-4577-9511-05a7f072b274-config-data" (OuterVolumeSpecName: "config-data") pod "4450687d-212c-4577-9511-05a7f072b274" (UID: "4450687d-212c-4577-9511-05a7f072b274"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.172966 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4450687d-212c-4577-9511-05a7f072b274-kube-api-access-sl2k5" (OuterVolumeSpecName: "kube-api-access-sl2k5") pod "4450687d-212c-4577-9511-05a7f072b274" (UID: "4450687d-212c-4577-9511-05a7f072b274"). InnerVolumeSpecName "kube-api-access-sl2k5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.178284 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "test-operator-logs") pod "4450687d-212c-4577-9511-05a7f072b274" (UID: "4450687d-212c-4577-9511-05a7f072b274"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.178355 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4450687d-212c-4577-9511-05a7f072b274-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "4450687d-212c-4577-9511-05a7f072b274" (UID: "4450687d-212c-4577-9511-05a7f072b274"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.179041 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4450687d-212c-4577-9511-05a7f072b274-openstack-config-secret\") pod \"4450687d-212c-4577-9511-05a7f072b274\" (UID: \"4450687d-212c-4577-9511-05a7f072b274\") " Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.180514 4869 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4450687d-212c-4577-9511-05a7f072b274-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.180613 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4450687d-212c-4577-9511-05a7f072b274-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.180698 4869 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.180787 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl2k5\" (UniqueName: \"kubernetes.io/projected/4450687d-212c-4577-9511-05a7f072b274-kube-api-access-sl2k5\") on node \"crc\" DevicePath \"\"" Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.180861 4869 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4450687d-212c-4577-9511-05a7f072b274-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.201084 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/4450687d-212c-4577-9511-05a7f072b274-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "4450687d-212c-4577-9511-05a7f072b274" (UID: "4450687d-212c-4577-9511-05a7f072b274"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.206348 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4450687d-212c-4577-9511-05a7f072b274-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4450687d-212c-4577-9511-05a7f072b274" (UID: "4450687d-212c-4577-9511-05a7f072b274"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.206528 4869 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.211079 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4450687d-212c-4577-9511-05a7f072b274-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "4450687d-212c-4577-9511-05a7f072b274" (UID: "4450687d-212c-4577-9511-05a7f072b274"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.222275 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4450687d-212c-4577-9511-05a7f072b274-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "4450687d-212c-4577-9511-05a7f072b274" (UID: "4450687d-212c-4577-9511-05a7f072b274"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.282824 4869 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.283129 4869 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4450687d-212c-4577-9511-05a7f072b274-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.283193 4869 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4450687d-212c-4577-9511-05a7f072b274-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.283327 4869 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4450687d-212c-4577-9511-05a7f072b274-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.283391 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4450687d-212c-4577-9511-05a7f072b274-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.744471 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4450687d-212c-4577-9511-05a7f072b274","Type":"ContainerDied","Data":"3c5c686cc0beb881c833427e85ac6704e47c230c3f2cd4cf93a87f6b56acdb15"} Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.745160 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c5c686cc0beb881c833427e85ac6704e47c230c3f2cd4cf93a87f6b56acdb15" Feb 18 06:41:38 crc kubenswrapper[4869]: I0218 06:41:38.745000 4869 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 18 06:41:39 crc kubenswrapper[4869]: I0218 06:41:39.484922 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22cb115d-afbf-4ee2-ad2f-59ebd862b386" path="/var/lib/kubelet/pods/22cb115d-afbf-4ee2-ad2f-59ebd862b386/volumes" Feb 18 06:41:42 crc kubenswrapper[4869]: I0218 06:41:42.470808 4869 scope.go:117] "RemoveContainer" containerID="23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6" Feb 18 06:41:42 crc kubenswrapper[4869]: E0218 06:41:42.471615 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:41:43 crc kubenswrapper[4869]: I0218 06:41:43.646503 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wvxdd" Feb 18 06:41:43 crc kubenswrapper[4869]: I0218 06:41:43.699266 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wvxdd" Feb 18 06:41:43 crc kubenswrapper[4869]: I0218 06:41:43.887701 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wvxdd"] Feb 18 06:41:44 crc kubenswrapper[4869]: I0218 06:41:44.795876 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wvxdd" podUID="1d58e670-3a92-4f04-8a1b-ace037634fea" containerName="registry-server" containerID="cri-o://bb604f6dfbac873809d800450710c0f8d88001be1783a8351f3c7acc30791b54" gracePeriod=2 Feb 18 06:41:45 crc kubenswrapper[4869]: I0218 06:41:45.282249 4869 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvxdd" Feb 18 06:41:45 crc kubenswrapper[4869]: I0218 06:41:45.418372 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d58e670-3a92-4f04-8a1b-ace037634fea-utilities\") pod \"1d58e670-3a92-4f04-8a1b-ace037634fea\" (UID: \"1d58e670-3a92-4f04-8a1b-ace037634fea\") " Feb 18 06:41:45 crc kubenswrapper[4869]: I0218 06:41:45.418522 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6kt4\" (UniqueName: \"kubernetes.io/projected/1d58e670-3a92-4f04-8a1b-ace037634fea-kube-api-access-r6kt4\") pod \"1d58e670-3a92-4f04-8a1b-ace037634fea\" (UID: \"1d58e670-3a92-4f04-8a1b-ace037634fea\") " Feb 18 06:41:45 crc kubenswrapper[4869]: I0218 06:41:45.418655 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d58e670-3a92-4f04-8a1b-ace037634fea-catalog-content\") pod \"1d58e670-3a92-4f04-8a1b-ace037634fea\" (UID: \"1d58e670-3a92-4f04-8a1b-ace037634fea\") " Feb 18 06:41:45 crc kubenswrapper[4869]: I0218 06:41:45.419437 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d58e670-3a92-4f04-8a1b-ace037634fea-utilities" (OuterVolumeSpecName: "utilities") pod "1d58e670-3a92-4f04-8a1b-ace037634fea" (UID: "1d58e670-3a92-4f04-8a1b-ace037634fea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:41:45 crc kubenswrapper[4869]: I0218 06:41:45.425204 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d58e670-3a92-4f04-8a1b-ace037634fea-kube-api-access-r6kt4" (OuterVolumeSpecName: "kube-api-access-r6kt4") pod "1d58e670-3a92-4f04-8a1b-ace037634fea" (UID: "1d58e670-3a92-4f04-8a1b-ace037634fea"). 
InnerVolumeSpecName "kube-api-access-r6kt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:41:45 crc kubenswrapper[4869]: I0218 06:41:45.520450 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d58e670-3a92-4f04-8a1b-ace037634fea-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:41:45 crc kubenswrapper[4869]: I0218 06:41:45.520479 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6kt4\" (UniqueName: \"kubernetes.io/projected/1d58e670-3a92-4f04-8a1b-ace037634fea-kube-api-access-r6kt4\") on node \"crc\" DevicePath \"\"" Feb 18 06:41:45 crc kubenswrapper[4869]: I0218 06:41:45.538123 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d58e670-3a92-4f04-8a1b-ace037634fea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d58e670-3a92-4f04-8a1b-ace037634fea" (UID: "1d58e670-3a92-4f04-8a1b-ace037634fea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:41:45 crc kubenswrapper[4869]: I0218 06:41:45.624599 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d58e670-3a92-4f04-8a1b-ace037634fea-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:41:45 crc kubenswrapper[4869]: I0218 06:41:45.806981 4869 generic.go:334] "Generic (PLEG): container finished" podID="1d58e670-3a92-4f04-8a1b-ace037634fea" containerID="bb604f6dfbac873809d800450710c0f8d88001be1783a8351f3c7acc30791b54" exitCode=0 Feb 18 06:41:45 crc kubenswrapper[4869]: I0218 06:41:45.807039 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wvxdd" Feb 18 06:41:45 crc kubenswrapper[4869]: I0218 06:41:45.807034 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvxdd" event={"ID":"1d58e670-3a92-4f04-8a1b-ace037634fea","Type":"ContainerDied","Data":"bb604f6dfbac873809d800450710c0f8d88001be1783a8351f3c7acc30791b54"} Feb 18 06:41:45 crc kubenswrapper[4869]: I0218 06:41:45.807172 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvxdd" event={"ID":"1d58e670-3a92-4f04-8a1b-ace037634fea","Type":"ContainerDied","Data":"adb23754bde2a18e705dc2576c587cda33d8af85c374d017b0a9a6db99de73a5"} Feb 18 06:41:45 crc kubenswrapper[4869]: I0218 06:41:45.807195 4869 scope.go:117] "RemoveContainer" containerID="bb604f6dfbac873809d800450710c0f8d88001be1783a8351f3c7acc30791b54" Feb 18 06:41:45 crc kubenswrapper[4869]: I0218 06:41:45.848891 4869 scope.go:117] "RemoveContainer" containerID="d2947d64bd39749a133f1b050460ead53b661f38ea33a9e9fcbbe48cf0a9debb" Feb 18 06:41:45 crc kubenswrapper[4869]: I0218 06:41:45.875571 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wvxdd"] Feb 18 06:41:45 crc kubenswrapper[4869]: I0218 06:41:45.885433 4869 scope.go:117] "RemoveContainer" containerID="0cf43e32780714d4e7b0d8a6b00db3a406a886da71f0170561f12abf16aeb0cf" Feb 18 06:41:45 crc kubenswrapper[4869]: I0218 06:41:45.886111 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wvxdd"] Feb 18 06:41:45 crc kubenswrapper[4869]: I0218 06:41:45.919882 4869 scope.go:117] "RemoveContainer" containerID="bb604f6dfbac873809d800450710c0f8d88001be1783a8351f3c7acc30791b54" Feb 18 06:41:45 crc kubenswrapper[4869]: E0218 06:41:45.920284 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bb604f6dfbac873809d800450710c0f8d88001be1783a8351f3c7acc30791b54\": container with ID starting with bb604f6dfbac873809d800450710c0f8d88001be1783a8351f3c7acc30791b54 not found: ID does not exist" containerID="bb604f6dfbac873809d800450710c0f8d88001be1783a8351f3c7acc30791b54" Feb 18 06:41:45 crc kubenswrapper[4869]: I0218 06:41:45.920313 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb604f6dfbac873809d800450710c0f8d88001be1783a8351f3c7acc30791b54"} err="failed to get container status \"bb604f6dfbac873809d800450710c0f8d88001be1783a8351f3c7acc30791b54\": rpc error: code = NotFound desc = could not find container \"bb604f6dfbac873809d800450710c0f8d88001be1783a8351f3c7acc30791b54\": container with ID starting with bb604f6dfbac873809d800450710c0f8d88001be1783a8351f3c7acc30791b54 not found: ID does not exist" Feb 18 06:41:45 crc kubenswrapper[4869]: I0218 06:41:45.920334 4869 scope.go:117] "RemoveContainer" containerID="d2947d64bd39749a133f1b050460ead53b661f38ea33a9e9fcbbe48cf0a9debb" Feb 18 06:41:45 crc kubenswrapper[4869]: E0218 06:41:45.920916 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2947d64bd39749a133f1b050460ead53b661f38ea33a9e9fcbbe48cf0a9debb\": container with ID starting with d2947d64bd39749a133f1b050460ead53b661f38ea33a9e9fcbbe48cf0a9debb not found: ID does not exist" containerID="d2947d64bd39749a133f1b050460ead53b661f38ea33a9e9fcbbe48cf0a9debb" Feb 18 06:41:45 crc kubenswrapper[4869]: I0218 06:41:45.920937 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2947d64bd39749a133f1b050460ead53b661f38ea33a9e9fcbbe48cf0a9debb"} err="failed to get container status \"d2947d64bd39749a133f1b050460ead53b661f38ea33a9e9fcbbe48cf0a9debb\": rpc error: code = NotFound desc = could not find container \"d2947d64bd39749a133f1b050460ead53b661f38ea33a9e9fcbbe48cf0a9debb\": container with ID 
starting with d2947d64bd39749a133f1b050460ead53b661f38ea33a9e9fcbbe48cf0a9debb not found: ID does not exist" Feb 18 06:41:45 crc kubenswrapper[4869]: I0218 06:41:45.920949 4869 scope.go:117] "RemoveContainer" containerID="0cf43e32780714d4e7b0d8a6b00db3a406a886da71f0170561f12abf16aeb0cf" Feb 18 06:41:45 crc kubenswrapper[4869]: E0218 06:41:45.921116 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cf43e32780714d4e7b0d8a6b00db3a406a886da71f0170561f12abf16aeb0cf\": container with ID starting with 0cf43e32780714d4e7b0d8a6b00db3a406a886da71f0170561f12abf16aeb0cf not found: ID does not exist" containerID="0cf43e32780714d4e7b0d8a6b00db3a406a886da71f0170561f12abf16aeb0cf" Feb 18 06:41:45 crc kubenswrapper[4869]: I0218 06:41:45.921134 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cf43e32780714d4e7b0d8a6b00db3a406a886da71f0170561f12abf16aeb0cf"} err="failed to get container status \"0cf43e32780714d4e7b0d8a6b00db3a406a886da71f0170561f12abf16aeb0cf\": rpc error: code = NotFound desc = could not find container \"0cf43e32780714d4e7b0d8a6b00db3a406a886da71f0170561f12abf16aeb0cf\": container with ID starting with 0cf43e32780714d4e7b0d8a6b00db3a406a886da71f0170561f12abf16aeb0cf not found: ID does not exist" Feb 18 06:41:46 crc kubenswrapper[4869]: I0218 06:41:46.419220 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 18 06:41:46 crc kubenswrapper[4869]: E0218 06:41:46.419897 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22cb115d-afbf-4ee2-ad2f-59ebd862b386" containerName="extract-utilities" Feb 18 06:41:46 crc kubenswrapper[4869]: I0218 06:41:46.419919 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="22cb115d-afbf-4ee2-ad2f-59ebd862b386" containerName="extract-utilities" Feb 18 06:41:46 crc kubenswrapper[4869]: E0218 06:41:46.419942 4869 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22cb115d-afbf-4ee2-ad2f-59ebd862b386" containerName="registry-server" Feb 18 06:41:46 crc kubenswrapper[4869]: I0218 06:41:46.419951 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="22cb115d-afbf-4ee2-ad2f-59ebd862b386" containerName="registry-server" Feb 18 06:41:46 crc kubenswrapper[4869]: E0218 06:41:46.419983 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d58e670-3a92-4f04-8a1b-ace037634fea" containerName="extract-utilities" Feb 18 06:41:46 crc kubenswrapper[4869]: I0218 06:41:46.419991 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d58e670-3a92-4f04-8a1b-ace037634fea" containerName="extract-utilities" Feb 18 06:41:46 crc kubenswrapper[4869]: E0218 06:41:46.420008 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d58e670-3a92-4f04-8a1b-ace037634fea" containerName="registry-server" Feb 18 06:41:46 crc kubenswrapper[4869]: I0218 06:41:46.420013 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d58e670-3a92-4f04-8a1b-ace037634fea" containerName="registry-server" Feb 18 06:41:46 crc kubenswrapper[4869]: E0218 06:41:46.420024 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22cb115d-afbf-4ee2-ad2f-59ebd862b386" containerName="extract-content" Feb 18 06:41:46 crc kubenswrapper[4869]: I0218 06:41:46.420030 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="22cb115d-afbf-4ee2-ad2f-59ebd862b386" containerName="extract-content" Feb 18 06:41:46 crc kubenswrapper[4869]: E0218 06:41:46.420039 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4450687d-212c-4577-9511-05a7f072b274" containerName="tempest-tests-tempest-tests-runner" Feb 18 06:41:46 crc kubenswrapper[4869]: I0218 06:41:46.420045 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="4450687d-212c-4577-9511-05a7f072b274" containerName="tempest-tests-tempest-tests-runner" Feb 18 06:41:46 crc kubenswrapper[4869]: 
E0218 06:41:46.420055 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d58e670-3a92-4f04-8a1b-ace037634fea" containerName="extract-content" Feb 18 06:41:46 crc kubenswrapper[4869]: I0218 06:41:46.420060 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d58e670-3a92-4f04-8a1b-ace037634fea" containerName="extract-content" Feb 18 06:41:46 crc kubenswrapper[4869]: I0218 06:41:46.420224 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d58e670-3a92-4f04-8a1b-ace037634fea" containerName="registry-server" Feb 18 06:41:46 crc kubenswrapper[4869]: I0218 06:41:46.420233 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="4450687d-212c-4577-9511-05a7f072b274" containerName="tempest-tests-tempest-tests-runner" Feb 18 06:41:46 crc kubenswrapper[4869]: I0218 06:41:46.420246 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="22cb115d-afbf-4ee2-ad2f-59ebd862b386" containerName="registry-server" Feb 18 06:41:46 crc kubenswrapper[4869]: I0218 06:41:46.420812 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 06:41:46 crc kubenswrapper[4869]: I0218 06:41:46.422786 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-6ts9p" Feb 18 06:41:46 crc kubenswrapper[4869]: I0218 06:41:46.428000 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 18 06:41:46 crc kubenswrapper[4869]: I0218 06:41:46.539976 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7d65\" (UniqueName: \"kubernetes.io/projected/498e7aa0-bb28-4bf6-919c-fef01def7f3d-kube-api-access-z7d65\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"498e7aa0-bb28-4bf6-919c-fef01def7f3d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 06:41:46 crc kubenswrapper[4869]: I0218 06:41:46.540057 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"498e7aa0-bb28-4bf6-919c-fef01def7f3d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 06:41:46 crc kubenswrapper[4869]: I0218 06:41:46.642147 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7d65\" (UniqueName: \"kubernetes.io/projected/498e7aa0-bb28-4bf6-919c-fef01def7f3d-kube-api-access-z7d65\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"498e7aa0-bb28-4bf6-919c-fef01def7f3d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 06:41:46 crc kubenswrapper[4869]: I0218 06:41:46.642257 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"498e7aa0-bb28-4bf6-919c-fef01def7f3d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 18 06:41:46 crc kubenswrapper[4869]: I0218 06:41:46.642920 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"498e7aa0-bb28-4bf6-919c-fef01def7f3d\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 18 06:41:46 crc kubenswrapper[4869]: I0218 06:41:46.662135 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7d65\" (UniqueName: \"kubernetes.io/projected/498e7aa0-bb28-4bf6-919c-fef01def7f3d-kube-api-access-z7d65\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"498e7aa0-bb28-4bf6-919c-fef01def7f3d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 18 06:41:46 crc kubenswrapper[4869]: I0218 06:41:46.678419 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"498e7aa0-bb28-4bf6-919c-fef01def7f3d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 18 06:41:46 crc kubenswrapper[4869]: I0218 06:41:46.745929 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 18 06:41:47 crc kubenswrapper[4869]: I0218 06:41:47.204948 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Feb 18 06:41:47 crc kubenswrapper[4869]: W0218 06:41:47.210254 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod498e7aa0_bb28_4bf6_919c_fef01def7f3d.slice/crio-52087c23ceeb89792a5bfe88ba64c6e3ac422968dc5aa39514907ca397ce2e0a WatchSource:0}: Error finding container 52087c23ceeb89792a5bfe88ba64c6e3ac422968dc5aa39514907ca397ce2e0a: Status 404 returned error can't find the container with id 52087c23ceeb89792a5bfe88ba64c6e3ac422968dc5aa39514907ca397ce2e0a
Feb 18 06:41:47 crc kubenswrapper[4869]: I0218 06:41:47.478719 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d58e670-3a92-4f04-8a1b-ace037634fea" path="/var/lib/kubelet/pods/1d58e670-3a92-4f04-8a1b-ace037634fea/volumes"
Feb 18 06:41:47 crc kubenswrapper[4869]: I0218 06:41:47.834807 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"498e7aa0-bb28-4bf6-919c-fef01def7f3d","Type":"ContainerStarted","Data":"52087c23ceeb89792a5bfe88ba64c6e3ac422968dc5aa39514907ca397ce2e0a"}
Feb 18 06:41:48 crc kubenswrapper[4869]: I0218 06:41:48.851101 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"498e7aa0-bb28-4bf6-919c-fef01def7f3d","Type":"ContainerStarted","Data":"9f8ab84149ca4ea5ab929147abf254339ce075ef8fdc1cd34639204ba6f3b9fc"}
Feb 18 06:41:48 crc kubenswrapper[4869]: I0218 06:41:48.864780 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.987032438 podStartE2EDuration="2.864761705s" podCreationTimestamp="2026-02-18 06:41:46 +0000 UTC" firstStartedPulling="2026-02-18 06:41:47.216715761 +0000 UTC m=+3204.385803993" lastFinishedPulling="2026-02-18 06:41:48.094445028 +0000 UTC m=+3205.263533260" observedRunningTime="2026-02-18 06:41:48.864444787 +0000 UTC m=+3206.033533059" watchObservedRunningTime="2026-02-18 06:41:48.864761705 +0000 UTC m=+3206.033849957"
Feb 18 06:41:57 crc kubenswrapper[4869]: I0218 06:41:57.471081 4869 scope.go:117] "RemoveContainer" containerID="23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6"
Feb 18 06:41:57 crc kubenswrapper[4869]: E0218 06:41:57.472670 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d"
Feb 18 06:42:08 crc kubenswrapper[4869]: I0218 06:42:08.470878 4869 scope.go:117] "RemoveContainer" containerID="23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6"
Feb 18 06:42:08 crc kubenswrapper[4869]: E0218 06:42:08.471777 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d"
Feb 18 06:42:09 crc kubenswrapper[4869]: I0218 06:42:09.256006 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5lb6s/must-gather-s4scs"]
Feb 18 06:42:09 crc kubenswrapper[4869]: I0218 06:42:09.259337 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5lb6s/must-gather-s4scs"
Feb 18 06:42:09 crc kubenswrapper[4869]: I0218 06:42:09.266570 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5lb6s"/"kube-root-ca.crt"
Feb 18 06:42:09 crc kubenswrapper[4869]: I0218 06:42:09.266581 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5lb6s"/"openshift-service-ca.crt"
Feb 18 06:42:09 crc kubenswrapper[4869]: I0218 06:42:09.277148 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5lb6s/must-gather-s4scs"]
Feb 18 06:42:09 crc kubenswrapper[4869]: I0218 06:42:09.397280 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqxfw\" (UniqueName: \"kubernetes.io/projected/37a8150e-3880-4f37-a4e2-57d498278882-kube-api-access-wqxfw\") pod \"must-gather-s4scs\" (UID: \"37a8150e-3880-4f37-a4e2-57d498278882\") " pod="openshift-must-gather-5lb6s/must-gather-s4scs"
Feb 18 06:42:09 crc kubenswrapper[4869]: I0218 06:42:09.397365 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/37a8150e-3880-4f37-a4e2-57d498278882-must-gather-output\") pod \"must-gather-s4scs\" (UID: \"37a8150e-3880-4f37-a4e2-57d498278882\") " pod="openshift-must-gather-5lb6s/must-gather-s4scs"
Feb 18 06:42:09 crc kubenswrapper[4869]: I0218 06:42:09.500111 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqxfw\" (UniqueName: \"kubernetes.io/projected/37a8150e-3880-4f37-a4e2-57d498278882-kube-api-access-wqxfw\") pod \"must-gather-s4scs\" (UID: \"37a8150e-3880-4f37-a4e2-57d498278882\") " pod="openshift-must-gather-5lb6s/must-gather-s4scs"
Feb 18 06:42:09 crc kubenswrapper[4869]: I0218 06:42:09.500176 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/37a8150e-3880-4f37-a4e2-57d498278882-must-gather-output\") pod \"must-gather-s4scs\" (UID: \"37a8150e-3880-4f37-a4e2-57d498278882\") " pod="openshift-must-gather-5lb6s/must-gather-s4scs"
Feb 18 06:42:09 crc kubenswrapper[4869]: I0218 06:42:09.500715 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/37a8150e-3880-4f37-a4e2-57d498278882-must-gather-output\") pod \"must-gather-s4scs\" (UID: \"37a8150e-3880-4f37-a4e2-57d498278882\") " pod="openshift-must-gather-5lb6s/must-gather-s4scs"
Feb 18 06:42:09 crc kubenswrapper[4869]: I0218 06:42:09.520915 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqxfw\" (UniqueName: \"kubernetes.io/projected/37a8150e-3880-4f37-a4e2-57d498278882-kube-api-access-wqxfw\") pod \"must-gather-s4scs\" (UID: \"37a8150e-3880-4f37-a4e2-57d498278882\") " pod="openshift-must-gather-5lb6s/must-gather-s4scs"
Feb 18 06:42:09 crc kubenswrapper[4869]: I0218 06:42:09.587253 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5lb6s/must-gather-s4scs"
Feb 18 06:42:10 crc kubenswrapper[4869]: I0218 06:42:10.019370 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5lb6s/must-gather-s4scs"]
Feb 18 06:42:10 crc kubenswrapper[4869]: I0218 06:42:10.083017 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5lb6s/must-gather-s4scs" event={"ID":"37a8150e-3880-4f37-a4e2-57d498278882","Type":"ContainerStarted","Data":"2e4572e8a3775a899ded94d0a55ae328b376d118faaaed7341c8424ebcfc48b1"}
Feb 18 06:42:17 crc kubenswrapper[4869]: I0218 06:42:17.173495 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5lb6s/must-gather-s4scs" event={"ID":"37a8150e-3880-4f37-a4e2-57d498278882","Type":"ContainerStarted","Data":"7c7196dc21bef9696b4b78905f9a759846b43fa4d6e0029ce601017856ab7d45"}
Feb 18 06:42:17 crc kubenswrapper[4869]: I0218 06:42:17.174041 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5lb6s/must-gather-s4scs" event={"ID":"37a8150e-3880-4f37-a4e2-57d498278882","Type":"ContainerStarted","Data":"18fa004024c4100284a351ad80918e63487e8ad2920e7f72307f7195fd58a19c"}
Feb 18 06:42:17 crc kubenswrapper[4869]: I0218 06:42:17.190472 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5lb6s/must-gather-s4scs" podStartSLOduration=2.2549789159999998 podStartE2EDuration="8.190454821s" podCreationTimestamp="2026-02-18 06:42:09 +0000 UTC" firstStartedPulling="2026-02-18 06:42:10.023530056 +0000 UTC m=+3227.192618288" lastFinishedPulling="2026-02-18 06:42:15.959005961 +0000 UTC m=+3233.128094193" observedRunningTime="2026-02-18 06:42:17.185967843 +0000 UTC m=+3234.355056135" watchObservedRunningTime="2026-02-18 06:42:17.190454821 +0000 UTC m=+3234.359543053"
Feb 18 06:42:19 crc kubenswrapper[4869]: I0218 06:42:19.944486 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5lb6s/crc-debug-tc525"]
Feb 18 06:42:19 crc kubenswrapper[4869]: I0218 06:42:19.945917 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5lb6s/crc-debug-tc525"
Feb 18 06:42:19 crc kubenswrapper[4869]: I0218 06:42:19.949261 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-5lb6s"/"default-dockercfg-sgtf6"
Feb 18 06:42:20 crc kubenswrapper[4869]: I0218 06:42:20.060782 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ccd1485-e685-4b7c-9a99-eb1781353464-host\") pod \"crc-debug-tc525\" (UID: \"7ccd1485-e685-4b7c-9a99-eb1781353464\") " pod="openshift-must-gather-5lb6s/crc-debug-tc525"
Feb 18 06:42:20 crc kubenswrapper[4869]: I0218 06:42:20.060878 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7khq8\" (UniqueName: \"kubernetes.io/projected/7ccd1485-e685-4b7c-9a99-eb1781353464-kube-api-access-7khq8\") pod \"crc-debug-tc525\" (UID: \"7ccd1485-e685-4b7c-9a99-eb1781353464\") " pod="openshift-must-gather-5lb6s/crc-debug-tc525"
Feb 18 06:42:20 crc kubenswrapper[4869]: I0218 06:42:20.164669 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ccd1485-e685-4b7c-9a99-eb1781353464-host\") pod \"crc-debug-tc525\" (UID: \"7ccd1485-e685-4b7c-9a99-eb1781353464\") " pod="openshift-must-gather-5lb6s/crc-debug-tc525"
Feb 18 06:42:20 crc kubenswrapper[4869]: I0218 06:42:20.164731 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7khq8\" (UniqueName: \"kubernetes.io/projected/7ccd1485-e685-4b7c-9a99-eb1781353464-kube-api-access-7khq8\") pod \"crc-debug-tc525\" (UID: \"7ccd1485-e685-4b7c-9a99-eb1781353464\") " pod="openshift-must-gather-5lb6s/crc-debug-tc525"
Feb 18 06:42:20 crc kubenswrapper[4869]: I0218 06:42:20.164860 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ccd1485-e685-4b7c-9a99-eb1781353464-host\") pod \"crc-debug-tc525\" (UID: \"7ccd1485-e685-4b7c-9a99-eb1781353464\") " pod="openshift-must-gather-5lb6s/crc-debug-tc525"
Feb 18 06:42:20 crc kubenswrapper[4869]: I0218 06:42:20.184946 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7khq8\" (UniqueName: \"kubernetes.io/projected/7ccd1485-e685-4b7c-9a99-eb1781353464-kube-api-access-7khq8\") pod \"crc-debug-tc525\" (UID: \"7ccd1485-e685-4b7c-9a99-eb1781353464\") " pod="openshift-must-gather-5lb6s/crc-debug-tc525"
Feb 18 06:42:20 crc kubenswrapper[4869]: I0218 06:42:20.267569 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5lb6s/crc-debug-tc525"
Feb 18 06:42:20 crc kubenswrapper[4869]: W0218 06:42:20.314660 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ccd1485_e685_4b7c_9a99_eb1781353464.slice/crio-441d278c984c186acba2d24b38f8854b84464ccc0334ca05634a2e00065ea7dd WatchSource:0}: Error finding container 441d278c984c186acba2d24b38f8854b84464ccc0334ca05634a2e00065ea7dd: Status 404 returned error can't find the container with id 441d278c984c186acba2d24b38f8854b84464ccc0334ca05634a2e00065ea7dd
Feb 18 06:42:21 crc kubenswrapper[4869]: I0218 06:42:21.216598 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5lb6s/crc-debug-tc525" event={"ID":"7ccd1485-e685-4b7c-9a99-eb1781353464","Type":"ContainerStarted","Data":"441d278c984c186acba2d24b38f8854b84464ccc0334ca05634a2e00065ea7dd"}
Feb 18 06:42:23 crc kubenswrapper[4869]: I0218 06:42:23.479417 4869 scope.go:117] "RemoveContainer" containerID="23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6"
Feb 18 06:42:24 crc kubenswrapper[4869]: I0218 06:42:24.249039 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerStarted","Data":"500f375524c1e00665538fefd3cad0a388148415b60956f273f733065771d7f1"}
Feb 18 06:42:33 crc kubenswrapper[4869]: I0218 06:42:33.349127 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5lb6s/crc-debug-tc525" event={"ID":"7ccd1485-e685-4b7c-9a99-eb1781353464","Type":"ContainerStarted","Data":"dc2119961d45e61eca2885d2a15a4074a1d63a309ba04c8e969e11c980ce8107"}
Feb 18 06:42:33 crc kubenswrapper[4869]: I0218 06:42:33.368519 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5lb6s/crc-debug-tc525" podStartSLOduration=2.002489732 podStartE2EDuration="14.368500597s" podCreationTimestamp="2026-02-18 06:42:19 +0000 UTC" firstStartedPulling="2026-02-18 06:42:20.317788056 +0000 UTC m=+3237.486876288" lastFinishedPulling="2026-02-18 06:42:32.683798921 +0000 UTC m=+3249.852887153" observedRunningTime="2026-02-18 06:42:33.359690833 +0000 UTC m=+3250.528779075" watchObservedRunningTime="2026-02-18 06:42:33.368500597 +0000 UTC m=+3250.537588829"
Feb 18 06:43:17 crc kubenswrapper[4869]: I0218 06:43:17.756889 4869 generic.go:334] "Generic (PLEG): container finished" podID="7ccd1485-e685-4b7c-9a99-eb1781353464" containerID="dc2119961d45e61eca2885d2a15a4074a1d63a309ba04c8e969e11c980ce8107" exitCode=0
Feb 18 06:43:17 crc kubenswrapper[4869]: I0218 06:43:17.756987 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5lb6s/crc-debug-tc525" event={"ID":"7ccd1485-e685-4b7c-9a99-eb1781353464","Type":"ContainerDied","Data":"dc2119961d45e61eca2885d2a15a4074a1d63a309ba04c8e969e11c980ce8107"}
Feb 18 06:43:18 crc kubenswrapper[4869]: I0218 06:43:18.879438 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5lb6s/crc-debug-tc525"
Feb 18 06:43:18 crc kubenswrapper[4869]: I0218 06:43:18.922879 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5lb6s/crc-debug-tc525"]
Feb 18 06:43:18 crc kubenswrapper[4869]: I0218 06:43:18.931510 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5lb6s/crc-debug-tc525"]
Feb 18 06:43:19 crc kubenswrapper[4869]: I0218 06:43:19.053057 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7khq8\" (UniqueName: \"kubernetes.io/projected/7ccd1485-e685-4b7c-9a99-eb1781353464-kube-api-access-7khq8\") pod \"7ccd1485-e685-4b7c-9a99-eb1781353464\" (UID: \"7ccd1485-e685-4b7c-9a99-eb1781353464\") "
Feb 18 06:43:19 crc kubenswrapper[4869]: I0218 06:43:19.053272 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ccd1485-e685-4b7c-9a99-eb1781353464-host\") pod \"7ccd1485-e685-4b7c-9a99-eb1781353464\" (UID: \"7ccd1485-e685-4b7c-9a99-eb1781353464\") "
Feb 18 06:43:19 crc kubenswrapper[4869]: I0218 06:43:19.053409 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ccd1485-e685-4b7c-9a99-eb1781353464-host" (OuterVolumeSpecName: "host") pod "7ccd1485-e685-4b7c-9a99-eb1781353464" (UID: "7ccd1485-e685-4b7c-9a99-eb1781353464"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 06:43:19 crc kubenswrapper[4869]: I0218 06:43:19.053986 4869 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ccd1485-e685-4b7c-9a99-eb1781353464-host\") on node \"crc\" DevicePath \"\""
Feb 18 06:43:19 crc kubenswrapper[4869]: I0218 06:43:19.059995 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ccd1485-e685-4b7c-9a99-eb1781353464-kube-api-access-7khq8" (OuterVolumeSpecName: "kube-api-access-7khq8") pod "7ccd1485-e685-4b7c-9a99-eb1781353464" (UID: "7ccd1485-e685-4b7c-9a99-eb1781353464"). InnerVolumeSpecName "kube-api-access-7khq8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:43:19 crc kubenswrapper[4869]: I0218 06:43:19.155226 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7khq8\" (UniqueName: \"kubernetes.io/projected/7ccd1485-e685-4b7c-9a99-eb1781353464-kube-api-access-7khq8\") on node \"crc\" DevicePath \"\""
Feb 18 06:43:19 crc kubenswrapper[4869]: I0218 06:43:19.480981 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ccd1485-e685-4b7c-9a99-eb1781353464" path="/var/lib/kubelet/pods/7ccd1485-e685-4b7c-9a99-eb1781353464/volumes"
Feb 18 06:43:19 crc kubenswrapper[4869]: I0218 06:43:19.780557 4869 scope.go:117] "RemoveContainer" containerID="dc2119961d45e61eca2885d2a15a4074a1d63a309ba04c8e969e11c980ce8107"
Feb 18 06:43:19 crc kubenswrapper[4869]: I0218 06:43:19.780636 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5lb6s/crc-debug-tc525"
Feb 18 06:43:20 crc kubenswrapper[4869]: I0218 06:43:20.089205 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5lb6s/crc-debug-krf4f"]
Feb 18 06:43:20 crc kubenswrapper[4869]: E0218 06:43:20.089980 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ccd1485-e685-4b7c-9a99-eb1781353464" containerName="container-00"
Feb 18 06:43:20 crc kubenswrapper[4869]: I0218 06:43:20.089996 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ccd1485-e685-4b7c-9a99-eb1781353464" containerName="container-00"
Feb 18 06:43:20 crc kubenswrapper[4869]: I0218 06:43:20.090242 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ccd1485-e685-4b7c-9a99-eb1781353464" containerName="container-00"
Feb 18 06:43:20 crc kubenswrapper[4869]: I0218 06:43:20.091056 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5lb6s/crc-debug-krf4f"
Feb 18 06:43:20 crc kubenswrapper[4869]: I0218 06:43:20.095349 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-5lb6s"/"default-dockercfg-sgtf6"
Feb 18 06:43:20 crc kubenswrapper[4869]: I0218 06:43:20.275597 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jr95\" (UniqueName: \"kubernetes.io/projected/4f1019e8-db94-419b-a4ae-3798de5a9e73-kube-api-access-4jr95\") pod \"crc-debug-krf4f\" (UID: \"4f1019e8-db94-419b-a4ae-3798de5a9e73\") " pod="openshift-must-gather-5lb6s/crc-debug-krf4f"
Feb 18 06:43:20 crc kubenswrapper[4869]: I0218 06:43:20.275663 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f1019e8-db94-419b-a4ae-3798de5a9e73-host\") pod \"crc-debug-krf4f\" (UID: \"4f1019e8-db94-419b-a4ae-3798de5a9e73\") " pod="openshift-must-gather-5lb6s/crc-debug-krf4f"
Feb 18 06:43:20 crc kubenswrapper[4869]: I0218 06:43:20.377553 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jr95\" (UniqueName: \"kubernetes.io/projected/4f1019e8-db94-419b-a4ae-3798de5a9e73-kube-api-access-4jr95\") pod \"crc-debug-krf4f\" (UID: \"4f1019e8-db94-419b-a4ae-3798de5a9e73\") " pod="openshift-must-gather-5lb6s/crc-debug-krf4f"
Feb 18 06:43:20 crc kubenswrapper[4869]: I0218 06:43:20.377630 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f1019e8-db94-419b-a4ae-3798de5a9e73-host\") pod \"crc-debug-krf4f\" (UID: \"4f1019e8-db94-419b-a4ae-3798de5a9e73\") " pod="openshift-must-gather-5lb6s/crc-debug-krf4f"
Feb 18 06:43:20 crc kubenswrapper[4869]: I0218 06:43:20.377772 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f1019e8-db94-419b-a4ae-3798de5a9e73-host\") pod \"crc-debug-krf4f\" (UID: \"4f1019e8-db94-419b-a4ae-3798de5a9e73\") " pod="openshift-must-gather-5lb6s/crc-debug-krf4f"
Feb 18 06:43:20 crc kubenswrapper[4869]: I0218 06:43:20.400908 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jr95\" (UniqueName: \"kubernetes.io/projected/4f1019e8-db94-419b-a4ae-3798de5a9e73-kube-api-access-4jr95\") pod \"crc-debug-krf4f\" (UID: \"4f1019e8-db94-419b-a4ae-3798de5a9e73\") " pod="openshift-must-gather-5lb6s/crc-debug-krf4f"
Feb 18 06:43:20 crc kubenswrapper[4869]: I0218 06:43:20.408099 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5lb6s/crc-debug-krf4f"
Feb 18 06:43:20 crc kubenswrapper[4869]: I0218 06:43:20.793214 4869 generic.go:334] "Generic (PLEG): container finished" podID="4f1019e8-db94-419b-a4ae-3798de5a9e73" containerID="c009ad8450a85ba7f7779c5d53778f7cd1bd186314349898739d26b361dfa606" exitCode=0
Feb 18 06:43:20 crc kubenswrapper[4869]: I0218 06:43:20.793270 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5lb6s/crc-debug-krf4f" event={"ID":"4f1019e8-db94-419b-a4ae-3798de5a9e73","Type":"ContainerDied","Data":"c009ad8450a85ba7f7779c5d53778f7cd1bd186314349898739d26b361dfa606"}
Feb 18 06:43:20 crc kubenswrapper[4869]: I0218 06:43:20.793343 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5lb6s/crc-debug-krf4f" event={"ID":"4f1019e8-db94-419b-a4ae-3798de5a9e73","Type":"ContainerStarted","Data":"99dcb318556081efd6cdc7b23119f17302dbfc457efcdd53e6ff40d38f624013"}
Feb 18 06:43:21 crc kubenswrapper[4869]: I0218 06:43:21.362911 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5lb6s/crc-debug-krf4f"]
Feb 18 06:43:21 crc kubenswrapper[4869]: I0218 06:43:21.372266 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5lb6s/crc-debug-krf4f"]
Feb 18 06:43:21 crc kubenswrapper[4869]: I0218 06:43:21.895678 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5lb6s/crc-debug-krf4f"
Feb 18 06:43:22 crc kubenswrapper[4869]: I0218 06:43:22.006814 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jr95\" (UniqueName: \"kubernetes.io/projected/4f1019e8-db94-419b-a4ae-3798de5a9e73-kube-api-access-4jr95\") pod \"4f1019e8-db94-419b-a4ae-3798de5a9e73\" (UID: \"4f1019e8-db94-419b-a4ae-3798de5a9e73\") "
Feb 18 06:43:22 crc kubenswrapper[4869]: I0218 06:43:22.007044 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f1019e8-db94-419b-a4ae-3798de5a9e73-host\") pod \"4f1019e8-db94-419b-a4ae-3798de5a9e73\" (UID: \"4f1019e8-db94-419b-a4ae-3798de5a9e73\") "
Feb 18 06:43:22 crc kubenswrapper[4869]: I0218 06:43:22.007127 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4f1019e8-db94-419b-a4ae-3798de5a9e73-host" (OuterVolumeSpecName: "host") pod "4f1019e8-db94-419b-a4ae-3798de5a9e73" (UID: "4f1019e8-db94-419b-a4ae-3798de5a9e73"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 06:43:22 crc kubenswrapper[4869]: I0218 06:43:22.007601 4869 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f1019e8-db94-419b-a4ae-3798de5a9e73-host\") on node \"crc\" DevicePath \"\""
Feb 18 06:43:22 crc kubenswrapper[4869]: I0218 06:43:22.020911 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f1019e8-db94-419b-a4ae-3798de5a9e73-kube-api-access-4jr95" (OuterVolumeSpecName: "kube-api-access-4jr95") pod "4f1019e8-db94-419b-a4ae-3798de5a9e73" (UID: "4f1019e8-db94-419b-a4ae-3798de5a9e73"). InnerVolumeSpecName "kube-api-access-4jr95". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:43:22 crc kubenswrapper[4869]: I0218 06:43:22.109465 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jr95\" (UniqueName: \"kubernetes.io/projected/4f1019e8-db94-419b-a4ae-3798de5a9e73-kube-api-access-4jr95\") on node \"crc\" DevicePath \"\""
Feb 18 06:43:22 crc kubenswrapper[4869]: I0218 06:43:22.536047 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5lb6s/crc-debug-c7f8b"]
Feb 18 06:43:22 crc kubenswrapper[4869]: E0218 06:43:22.536923 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f1019e8-db94-419b-a4ae-3798de5a9e73" containerName="container-00"
Feb 18 06:43:22 crc kubenswrapper[4869]: I0218 06:43:22.536948 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f1019e8-db94-419b-a4ae-3798de5a9e73" containerName="container-00"
Feb 18 06:43:22 crc kubenswrapper[4869]: I0218 06:43:22.537297 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f1019e8-db94-419b-a4ae-3798de5a9e73" containerName="container-00"
Feb 18 06:43:22 crc kubenswrapper[4869]: I0218 06:43:22.538525 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5lb6s/crc-debug-c7f8b"
Feb 18 06:43:22 crc kubenswrapper[4869]: I0218 06:43:22.724334 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/059f550c-4150-4931-85b8-68fec5650761-host\") pod \"crc-debug-c7f8b\" (UID: \"059f550c-4150-4931-85b8-68fec5650761\") " pod="openshift-must-gather-5lb6s/crc-debug-c7f8b"
Feb 18 06:43:22 crc kubenswrapper[4869]: I0218 06:43:22.724987 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmjjb\" (UniqueName: \"kubernetes.io/projected/059f550c-4150-4931-85b8-68fec5650761-kube-api-access-jmjjb\") pod \"crc-debug-c7f8b\" (UID: \"059f550c-4150-4931-85b8-68fec5650761\") " pod="openshift-must-gather-5lb6s/crc-debug-c7f8b"
Feb 18 06:43:22 crc kubenswrapper[4869]: I0218 06:43:22.810666 4869 scope.go:117] "RemoveContainer" containerID="c009ad8450a85ba7f7779c5d53778f7cd1bd186314349898739d26b361dfa606"
Feb 18 06:43:22 crc kubenswrapper[4869]: I0218 06:43:22.810723 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5lb6s/crc-debug-krf4f"
Feb 18 06:43:22 crc kubenswrapper[4869]: I0218 06:43:22.827633 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/059f550c-4150-4931-85b8-68fec5650761-host\") pod \"crc-debug-c7f8b\" (UID: \"059f550c-4150-4931-85b8-68fec5650761\") " pod="openshift-must-gather-5lb6s/crc-debug-c7f8b"
Feb 18 06:43:22 crc kubenswrapper[4869]: I0218 06:43:22.827785 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/059f550c-4150-4931-85b8-68fec5650761-host\") pod \"crc-debug-c7f8b\" (UID: \"059f550c-4150-4931-85b8-68fec5650761\") " pod="openshift-must-gather-5lb6s/crc-debug-c7f8b"
Feb 18 06:43:22 crc kubenswrapper[4869]: I0218 06:43:22.827813 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmjjb\" (UniqueName: \"kubernetes.io/projected/059f550c-4150-4931-85b8-68fec5650761-kube-api-access-jmjjb\") pod \"crc-debug-c7f8b\" (UID: \"059f550c-4150-4931-85b8-68fec5650761\") " pod="openshift-must-gather-5lb6s/crc-debug-c7f8b"
Feb 18 06:43:22 crc kubenswrapper[4869]: I0218 06:43:22.844294 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmjjb\" (UniqueName: \"kubernetes.io/projected/059f550c-4150-4931-85b8-68fec5650761-kube-api-access-jmjjb\") pod \"crc-debug-c7f8b\" (UID: \"059f550c-4150-4931-85b8-68fec5650761\") " pod="openshift-must-gather-5lb6s/crc-debug-c7f8b"
Feb 18 06:43:22 crc kubenswrapper[4869]: I0218 06:43:22.859102 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5lb6s/crc-debug-c7f8b"
Feb 18 06:43:22 crc kubenswrapper[4869]: W0218 06:43:22.892604 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod059f550c_4150_4931_85b8_68fec5650761.slice/crio-1fb43159caddcf87d8193cbd93392cc497bbac5cb29993a829e78e1fa23d09a0 WatchSource:0}: Error finding container 1fb43159caddcf87d8193cbd93392cc497bbac5cb29993a829e78e1fa23d09a0: Status 404 returned error can't find the container with id 1fb43159caddcf87d8193cbd93392cc497bbac5cb29993a829e78e1fa23d09a0
Feb 18 06:43:23 crc kubenswrapper[4869]: I0218 06:43:23.485146 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f1019e8-db94-419b-a4ae-3798de5a9e73" path="/var/lib/kubelet/pods/4f1019e8-db94-419b-a4ae-3798de5a9e73/volumes"
Feb 18 06:43:23 crc kubenswrapper[4869]: I0218 06:43:23.819950 4869 generic.go:334] "Generic (PLEG): container finished" podID="059f550c-4150-4931-85b8-68fec5650761" containerID="a3caa97f898060873242b3639dd3f8b09cd8811477e849e009e1aa8b0ef2adef" exitCode=0
Feb 18 06:43:23 crc kubenswrapper[4869]: I0218 06:43:23.820023 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5lb6s/crc-debug-c7f8b" event={"ID":"059f550c-4150-4931-85b8-68fec5650761","Type":"ContainerDied","Data":"a3caa97f898060873242b3639dd3f8b09cd8811477e849e009e1aa8b0ef2adef"}
Feb 18 06:43:23 crc kubenswrapper[4869]: I0218 06:43:23.820053 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5lb6s/crc-debug-c7f8b" event={"ID":"059f550c-4150-4931-85b8-68fec5650761","Type":"ContainerStarted","Data":"1fb43159caddcf87d8193cbd93392cc497bbac5cb29993a829e78e1fa23d09a0"}
Feb 18 06:43:23 crc kubenswrapper[4869]: I0218 06:43:23.871862 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5lb6s/crc-debug-c7f8b"]
Feb 18 06:43:23 crc kubenswrapper[4869]: I0218 06:43:23.888511 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5lb6s/crc-debug-c7f8b"]
Feb 18 06:43:24 crc kubenswrapper[4869]: I0218 06:43:24.939055 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5lb6s/crc-debug-c7f8b"
Feb 18 06:43:25 crc kubenswrapper[4869]: I0218 06:43:25.070024 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/059f550c-4150-4931-85b8-68fec5650761-host\") pod \"059f550c-4150-4931-85b8-68fec5650761\" (UID: \"059f550c-4150-4931-85b8-68fec5650761\") "
Feb 18 06:43:25 crc kubenswrapper[4869]: I0218 06:43:25.070464 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmjjb\" (UniqueName: \"kubernetes.io/projected/059f550c-4150-4931-85b8-68fec5650761-kube-api-access-jmjjb\") pod \"059f550c-4150-4931-85b8-68fec5650761\" (UID: \"059f550c-4150-4931-85b8-68fec5650761\") "
Feb 18 06:43:25 crc kubenswrapper[4869]: I0218 06:43:25.071766 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/059f550c-4150-4931-85b8-68fec5650761-host" (OuterVolumeSpecName: "host") pod "059f550c-4150-4931-85b8-68fec5650761" (UID: "059f550c-4150-4931-85b8-68fec5650761"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 06:43:25 crc kubenswrapper[4869]: I0218 06:43:25.083616 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/059f550c-4150-4931-85b8-68fec5650761-kube-api-access-jmjjb" (OuterVolumeSpecName: "kube-api-access-jmjjb") pod "059f550c-4150-4931-85b8-68fec5650761" (UID: "059f550c-4150-4931-85b8-68fec5650761"). InnerVolumeSpecName "kube-api-access-jmjjb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:43:25 crc kubenswrapper[4869]: I0218 06:43:25.172431 4869 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/059f550c-4150-4931-85b8-68fec5650761-host\") on node \"crc\" DevicePath \"\""
Feb 18 06:43:25 crc kubenswrapper[4869]: I0218 06:43:25.172464 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmjjb\" (UniqueName: \"kubernetes.io/projected/059f550c-4150-4931-85b8-68fec5650761-kube-api-access-jmjjb\") on node \"crc\" DevicePath \"\""
Feb 18 06:43:25 crc kubenswrapper[4869]: I0218 06:43:25.489652 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="059f550c-4150-4931-85b8-68fec5650761" path="/var/lib/kubelet/pods/059f550c-4150-4931-85b8-68fec5650761/volumes"
Feb 18 06:43:25 crc kubenswrapper[4869]: I0218 06:43:25.839818 4869 scope.go:117] "RemoveContainer" containerID="a3caa97f898060873242b3639dd3f8b09cd8811477e849e009e1aa8b0ef2adef"
Feb 18 06:43:25 crc kubenswrapper[4869]: I0218 06:43:25.839853 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5lb6s/crc-debug-c7f8b"
Feb 18 06:43:38 crc kubenswrapper[4869]: I0218 06:43:38.772961 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-85cc9b8698-mzcgf_00e28946-38e8-4b00-8181-d45908ad9863/barbican-api/0.log"
Feb 18 06:43:38 crc kubenswrapper[4869]: I0218 06:43:38.890371 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-85cc9b8698-mzcgf_00e28946-38e8-4b00-8181-d45908ad9863/barbican-api-log/0.log"
Feb 18 06:43:38 crc kubenswrapper[4869]: I0218 06:43:38.933813 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5884488646-54ch2_4ed72952-b72a-4f66-8e63-84d18936ff3a/barbican-keystone-listener/0.log"
Feb 18 06:43:39 crc kubenswrapper[4869]: I0218 06:43:39.043296 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5884488646-54ch2_4ed72952-b72a-4f66-8e63-84d18936ff3a/barbican-keystone-listener-log/0.log"
Feb 18 06:43:39 crc kubenswrapper[4869]: I0218 06:43:39.117686 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-98f68fcf5-pz8rv_c3b79627-ea10-4a59-a5ae-f24d3ace238d/barbican-worker/0.log"
Feb 18 06:43:39 crc kubenswrapper[4869]: I0218 06:43:39.152950 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-98f68fcf5-pz8rv_c3b79627-ea10-4a59-a5ae-f24d3ace238d/barbican-worker-log/0.log"
Feb 18 06:43:39 crc kubenswrapper[4869]: I0218 06:43:39.317331 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw_26716094-10bf-4523-9c23-674dd4b7d517/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 06:43:39 crc kubenswrapper[4869]: I0218 06:43:39.370608 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e0226ffe-db7d-48b2-acee-8a9f7045c083/ceilometer-central-agent/0.log"
Feb 
18 06:43:39 crc kubenswrapper[4869]: I0218 06:43:39.448109 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e0226ffe-db7d-48b2-acee-8a9f7045c083/ceilometer-notification-agent/0.log" Feb 18 06:43:39 crc kubenswrapper[4869]: I0218 06:43:39.492653 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e0226ffe-db7d-48b2-acee-8a9f7045c083/sg-core/0.log" Feb 18 06:43:39 crc kubenswrapper[4869]: I0218 06:43:39.511768 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e0226ffe-db7d-48b2-acee-8a9f7045c083/proxy-httpd/0.log" Feb 18 06:43:39 crc kubenswrapper[4869]: I0218 06:43:39.674604 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e110f65d-fa29-4024-a0d8-352543bd0c1b/cinder-api/0.log" Feb 18 06:43:39 crc kubenswrapper[4869]: I0218 06:43:39.684981 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e110f65d-fa29-4024-a0d8-352543bd0c1b/cinder-api-log/0.log" Feb 18 06:43:39 crc kubenswrapper[4869]: I0218 06:43:39.843812 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_c911137c-7aa9-4875-ae81-e91caebd828a/cinder-scheduler/0.log" Feb 18 06:43:39 crc kubenswrapper[4869]: I0218 06:43:39.871882 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_c911137c-7aa9-4875-ae81-e91caebd828a/probe/0.log" Feb 18 06:43:39 crc kubenswrapper[4869]: I0218 06:43:39.960160 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn_1475668a-1132-4548-a5e6-0f4a459480c1/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 06:43:40 crc kubenswrapper[4869]: I0218 06:43:40.054118 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f_8ab4b789-eeaf-4e68-b947-436fc6f6bafa/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 06:43:40 crc kubenswrapper[4869]: I0218 06:43:40.159285 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-n44p5_57134f39-764c-4164-a5ab-9392660d554b/init/0.log" Feb 18 06:43:40 crc kubenswrapper[4869]: I0218 06:43:40.357001 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-n44p5_57134f39-764c-4164-a5ab-9392660d554b/init/0.log" Feb 18 06:43:40 crc kubenswrapper[4869]: I0218 06:43:40.374659 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-n44p5_57134f39-764c-4164-a5ab-9392660d554b/dnsmasq-dns/0.log" Feb 18 06:43:40 crc kubenswrapper[4869]: I0218 06:43:40.418939 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn_102527af-43b3-4260-bdbf-cd653b203986/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 06:43:40 crc kubenswrapper[4869]: I0218 06:43:40.568552 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_469d9354-4653-4a99-b457-3b453082e0e0/glance-log/0.log" Feb 18 06:43:40 crc kubenswrapper[4869]: I0218 06:43:40.575609 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_469d9354-4653-4a99-b457-3b453082e0e0/glance-httpd/0.log" Feb 18 06:43:40 crc kubenswrapper[4869]: I0218 06:43:40.734960 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_1dfb852b-63ab-46ce-8f5b-3c6be7b02400/glance-httpd/0.log" Feb 18 06:43:40 crc kubenswrapper[4869]: I0218 06:43:40.736170 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_1dfb852b-63ab-46ce-8f5b-3c6be7b02400/glance-log/0.log" Feb 18 06:43:40 crc kubenswrapper[4869]: I0218 06:43:40.912581 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-57f5fddd88-qhh5n_391d8fe4-58ea-434e-918f-811b7c3e14b2/horizon/0.log" Feb 18 06:43:41 crc kubenswrapper[4869]: I0218 06:43:41.029600 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-bdr55_9e02b084-943e-4579-87f9-6a0cdff0d8c1/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 06:43:41 crc kubenswrapper[4869]: I0218 06:43:41.202470 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-cctjr_c1b93caa-11f6-4841-b63c-6542711f26cc/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 06:43:41 crc kubenswrapper[4869]: I0218 06:43:41.216567 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-57f5fddd88-qhh5n_391d8fe4-58ea-434e-918f-811b7c3e14b2/horizon-log/0.log" Feb 18 06:43:41 crc kubenswrapper[4869]: I0218 06:43:41.410321 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_ecd7b730-0094-40d2-9894-d90d45f8c2de/kube-state-metrics/0.log" Feb 18 06:43:41 crc kubenswrapper[4869]: I0218 06:43:41.527677 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-666896fcd4-c65vb_e56418e8-0afb-47a6-9064-ff0a381ef2ba/keystone-api/0.log" Feb 18 06:43:41 crc kubenswrapper[4869]: I0218 06:43:41.668232 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w_162af8f6-3123-4d8b-a602-0b2808cd6654/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 06:43:42 crc kubenswrapper[4869]: I0218 06:43:42.031416 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-6d4594595c-zvnfb_dffbc8b7-8080-4958-915d-ee66f5ae732b/neutron-httpd/0.log" Feb 18 06:43:42 crc kubenswrapper[4869]: I0218 06:43:42.047623 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d4594595c-zvnfb_dffbc8b7-8080-4958-915d-ee66f5ae732b/neutron-api/0.log" Feb 18 06:43:42 crc kubenswrapper[4869]: I0218 06:43:42.262820 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw_4c96cef9-45b8-4639-a368-063acac72c83/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 06:43:42 crc kubenswrapper[4869]: I0218 06:43:42.635027 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_1b848f50-bd92-4c86-8f5e-64c4fd3e2521/nova-api-log/0.log" Feb 18 06:43:42 crc kubenswrapper[4869]: I0218 06:43:42.742520 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_1b848f50-bd92-4c86-8f5e-64c4fd3e2521/nova-api-api/0.log" Feb 18 06:43:42 crc kubenswrapper[4869]: I0218 06:43:42.752076 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_eabb4471-00e8-4edb-9128-249b4057d5d7/nova-cell0-conductor-conductor/0.log" Feb 18 06:43:42 crc kubenswrapper[4869]: I0218 06:43:42.910530 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_4662bd21-e3c5-4980-bac5-dc1f76c958c3/nova-cell1-conductor-conductor/0.log" Feb 18 06:43:43 crc kubenswrapper[4869]: I0218 06:43:43.051830 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_50575a1a-5d98-4692-b1ca-d275e90c6fed/nova-cell1-novncproxy-novncproxy/0.log" Feb 18 06:43:43 crc kubenswrapper[4869]: I0218 06:43:43.207366 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-8p8x4_7d30bdae-b0f6-49aa-b343-c2b9abc186ba/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 06:43:43 crc kubenswrapper[4869]: I0218 06:43:43.274022 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f/nova-metadata-log/0.log" Feb 18 06:43:43 crc kubenswrapper[4869]: I0218 06:43:43.667172 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_f31fe572-a7a1-48f4-8ebb-e621788c2456/nova-scheduler-scheduler/0.log" Feb 18 06:43:43 crc kubenswrapper[4869]: I0218 06:43:43.691207 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_23dc38a3-9ce0-4f1f-9495-2dc65f2474e5/mysql-bootstrap/0.log" Feb 18 06:43:43 crc kubenswrapper[4869]: I0218 06:43:43.873564 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_23dc38a3-9ce0-4f1f-9495-2dc65f2474e5/mysql-bootstrap/0.log" Feb 18 06:43:43 crc kubenswrapper[4869]: I0218 06:43:43.893194 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_23dc38a3-9ce0-4f1f-9495-2dc65f2474e5/galera/0.log" Feb 18 06:43:44 crc kubenswrapper[4869]: I0218 06:43:44.084257 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_32064888-24ad-482d-ba16-36bfb48b069e/mysql-bootstrap/0.log" Feb 18 06:43:44 crc kubenswrapper[4869]: I0218 06:43:44.305559 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f/nova-metadata-metadata/0.log" Feb 18 06:43:44 crc kubenswrapper[4869]: I0218 06:43:44.322413 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_32064888-24ad-482d-ba16-36bfb48b069e/mysql-bootstrap/0.log" Feb 18 06:43:44 crc kubenswrapper[4869]: I0218 06:43:44.323322 4869 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_openstack-galera-0_32064888-24ad-482d-ba16-36bfb48b069e/galera/0.log" Feb 18 06:43:44 crc kubenswrapper[4869]: I0218 06:43:44.519067 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_348bdc65-0fd0-4870-adfc-d0d69a51e762/openstackclient/0.log" Feb 18 06:43:44 crc kubenswrapper[4869]: I0218 06:43:44.522220 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-6zzxt_0b85434e-56f8-4cab-91a5-8cf0ea0356fc/ovn-controller/0.log" Feb 18 06:43:45 crc kubenswrapper[4869]: I0218 06:43:45.014707 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-47kkq_7cb275de-f7e9-434d-b934-37dfb39e92ac/openstack-network-exporter/0.log" Feb 18 06:43:45 crc kubenswrapper[4869]: I0218 06:43:45.084142 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5czp2_8e4e9056-1f05-4fc5-b1a1-e578abbc24c6/ovsdb-server-init/0.log" Feb 18 06:43:45 crc kubenswrapper[4869]: I0218 06:43:45.237592 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5czp2_8e4e9056-1f05-4fc5-b1a1-e578abbc24c6/ovsdb-server-init/0.log" Feb 18 06:43:45 crc kubenswrapper[4869]: I0218 06:43:45.260987 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5czp2_8e4e9056-1f05-4fc5-b1a1-e578abbc24c6/ovs-vswitchd/0.log" Feb 18 06:43:45 crc kubenswrapper[4869]: I0218 06:43:45.322854 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5czp2_8e4e9056-1f05-4fc5-b1a1-e578abbc24c6/ovsdb-server/0.log" Feb 18 06:43:45 crc kubenswrapper[4869]: I0218 06:43:45.478394 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_65ad1fc9-3393-4e57-9041-c17ef5279ddd/openstack-network-exporter/0.log" Feb 18 06:43:45 crc kubenswrapper[4869]: I0218 06:43:45.492080 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-kqrfk_b8020d5a-e997-4376-bef7-488e40f51277/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 06:43:45 crc kubenswrapper[4869]: I0218 06:43:45.547940 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_65ad1fc9-3393-4e57-9041-c17ef5279ddd/ovn-northd/0.log" Feb 18 06:43:45 crc kubenswrapper[4869]: I0218 06:43:45.752109 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_25b0cd0b-8d96-4067-a1da-171e5f0b9545/ovsdbserver-nb/0.log" Feb 18 06:43:45 crc kubenswrapper[4869]: I0218 06:43:45.806059 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_25b0cd0b-8d96-4067-a1da-171e5f0b9545/openstack-network-exporter/0.log" Feb 18 06:43:45 crc kubenswrapper[4869]: I0218 06:43:45.929286 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0991303d-6180-44e1-9baa-88ece3cdbfaf/openstack-network-exporter/0.log" Feb 18 06:43:45 crc kubenswrapper[4869]: I0218 06:43:45.976601 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0991303d-6180-44e1-9baa-88ece3cdbfaf/ovsdbserver-sb/0.log" Feb 18 06:43:46 crc kubenswrapper[4869]: I0218 06:43:46.042342 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-55b8768b96-mwv6g_965b4f29-cb41-4066-a9e6-3729ec43b2bd/placement-api/0.log" Feb 18 06:43:46 crc kubenswrapper[4869]: I0218 06:43:46.267355 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-55b8768b96-mwv6g_965b4f29-cb41-4066-a9e6-3729ec43b2bd/placement-log/0.log" Feb 18 06:43:46 crc kubenswrapper[4869]: I0218 06:43:46.294801 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_eb17f4cc-a879-4fb2-be2e-4e0e47167746/setup-container/0.log" Feb 18 06:43:46 crc kubenswrapper[4869]: I0218 06:43:46.470283 4869 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_eb17f4cc-a879-4fb2-be2e-4e0e47167746/rabbitmq/0.log" Feb 18 06:43:46 crc kubenswrapper[4869]: I0218 06:43:46.527947 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_eb17f4cc-a879-4fb2-be2e-4e0e47167746/setup-container/0.log" Feb 18 06:43:46 crc kubenswrapper[4869]: I0218 06:43:46.554831 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_35973c92-2b94-4366-aa4b-637920311279/setup-container/0.log" Feb 18 06:43:46 crc kubenswrapper[4869]: I0218 06:43:46.785387 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_35973c92-2b94-4366-aa4b-637920311279/setup-container/0.log" Feb 18 06:43:46 crc kubenswrapper[4869]: I0218 06:43:46.799942 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_35973c92-2b94-4366-aa4b-637920311279/rabbitmq/0.log" Feb 18 06:43:46 crc kubenswrapper[4869]: I0218 06:43:46.856863 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz_4d6656f7-173a-4e9e-b802-6547876438ec/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 06:43:47 crc kubenswrapper[4869]: I0218 06:43:47.017615 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-h9skj_93c9d860-2ea7-4a81-b383-aae67501c7f8/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 06:43:47 crc kubenswrapper[4869]: I0218 06:43:47.103645 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8_a49b5bd1-87a9-4536-b8fb-5f32f8024b8a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 06:43:47 crc kubenswrapper[4869]: I0218 06:43:47.272671 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-d4jbn_d4b7d5ea-dca6-4f74-8143-17a7573402d3/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 06:43:47 crc kubenswrapper[4869]: I0218 06:43:47.340586 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-5rbl5_76fa3bfe-8da3-4ceb-95c6-de5473957a3e/ssh-known-hosts-edpm-deployment/0.log" Feb 18 06:43:47 crc kubenswrapper[4869]: I0218 06:43:47.602027 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-846bf8ff8c-7j4wb_23e9d7e3-bcc7-493e-84a6-e646ab36e6f0/proxy-server/0.log" Feb 18 06:43:47 crc kubenswrapper[4869]: I0218 06:43:47.621071 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-846bf8ff8c-7j4wb_23e9d7e3-bcc7-493e-84a6-e646ab36e6f0/proxy-httpd/0.log" Feb 18 06:43:47 crc kubenswrapper[4869]: I0218 06:43:47.716888 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-vc4l8_0cb2f895-3d57-468e-8197-636fcc33afe4/swift-ring-rebalance/0.log" Feb 18 06:43:47 crc kubenswrapper[4869]: I0218 06:43:47.825930 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/account-reaper/0.log" Feb 18 06:43:47 crc kubenswrapper[4869]: I0218 06:43:47.894651 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/account-auditor/0.log" Feb 18 06:43:48 crc kubenswrapper[4869]: I0218 06:43:48.000389 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/account-server/0.log" Feb 18 06:43:48 crc kubenswrapper[4869]: I0218 06:43:48.017917 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/container-auditor/0.log" Feb 18 06:43:48 crc kubenswrapper[4869]: I0218 
06:43:48.051193 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/account-replicator/0.log" Feb 18 06:43:48 crc kubenswrapper[4869]: I0218 06:43:48.132360 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/container-replicator/0.log" Feb 18 06:43:48 crc kubenswrapper[4869]: I0218 06:43:48.251550 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/container-server/0.log" Feb 18 06:43:48 crc kubenswrapper[4869]: I0218 06:43:48.279971 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/object-auditor/0.log" Feb 18 06:43:48 crc kubenswrapper[4869]: I0218 06:43:48.327152 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/container-updater/0.log" Feb 18 06:43:48 crc kubenswrapper[4869]: I0218 06:43:48.411186 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/object-expirer/0.log" Feb 18 06:43:48 crc kubenswrapper[4869]: I0218 06:43:48.470895 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/object-replicator/0.log" Feb 18 06:43:48 crc kubenswrapper[4869]: I0218 06:43:48.494697 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/object-server/0.log" Feb 18 06:43:48 crc kubenswrapper[4869]: I0218 06:43:48.537983 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/object-updater/0.log" Feb 18 06:43:48 crc kubenswrapper[4869]: I0218 06:43:48.648677 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/rsync/0.log" Feb 18 06:43:48 crc kubenswrapper[4869]: I0218 06:43:48.850798 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/swift-recon-cron/0.log" Feb 18 06:43:49 crc kubenswrapper[4869]: I0218 06:43:49.017617 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5_bbec6484-4b0d-477a-832a-9fb69ce89f4a/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 06:43:49 crc kubenswrapper[4869]: I0218 06:43:49.053144 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_4450687d-212c-4577-9511-05a7f072b274/tempest-tests-tempest-tests-runner/0.log" Feb 18 06:43:49 crc kubenswrapper[4869]: I0218 06:43:49.265662 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_498e7aa0-bb28-4bf6-919c-fef01def7f3d/test-operator-logs-container/0.log" Feb 18 06:43:49 crc kubenswrapper[4869]: I0218 06:43:49.330550 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-52cjm_eeb5893c-dc4f-4cb4-b55b-007c03e03889/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 06:43:56 crc kubenswrapper[4869]: I0218 06:43:56.365882 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_a6c89f7e-0259-4c42-9e24-cd8391cda1a3/memcached/0.log" Feb 18 06:44:12 crc kubenswrapper[4869]: I0218 06:44:12.978501 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4_a65bf9b8-ddf3-4a68-8dfd-fa484987b27b/util/0.log" Feb 18 06:44:13 crc kubenswrapper[4869]: I0218 06:44:13.166024 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4_a65bf9b8-ddf3-4a68-8dfd-fa484987b27b/pull/0.log" Feb 18 06:44:13 crc kubenswrapper[4869]: I0218 06:44:13.258461 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4_a65bf9b8-ddf3-4a68-8dfd-fa484987b27b/util/0.log" Feb 18 06:44:13 crc kubenswrapper[4869]: I0218 06:44:13.269987 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4_a65bf9b8-ddf3-4a68-8dfd-fa484987b27b/pull/0.log" Feb 18 06:44:13 crc kubenswrapper[4869]: I0218 06:44:13.445025 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4_a65bf9b8-ddf3-4a68-8dfd-fa484987b27b/util/0.log" Feb 18 06:44:13 crc kubenswrapper[4869]: I0218 06:44:13.459989 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4_a65bf9b8-ddf3-4a68-8dfd-fa484987b27b/pull/0.log" Feb 18 06:44:13 crc kubenswrapper[4869]: I0218 06:44:13.543077 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4_a65bf9b8-ddf3-4a68-8dfd-fa484987b27b/extract/0.log" Feb 18 06:44:13 crc kubenswrapper[4869]: I0218 06:44:13.865987 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-8gxgm_77f20e81-cc4d-44ab-9f77-40080cc392ec/manager/0.log" Feb 18 06:44:14 crc kubenswrapper[4869]: I0218 06:44:14.286768 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-gfxvp_02aea0c3-b59c-41dd-9c48-514fd4bfa94c/manager/0.log" Feb 18 06:44:14 crc 
kubenswrapper[4869]: I0218 06:44:14.494963 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-s9l2b_65820ad0-cf24-499c-b418-8980edb8788a/manager/0.log" Feb 18 06:44:14 crc kubenswrapper[4869]: I0218 06:44:14.692221 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-n5zdw_4a638516-be5b-4a24-9d1a-cc5dbcaac3ed/manager/0.log" Feb 18 06:44:15 crc kubenswrapper[4869]: I0218 06:44:15.105222 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-knzcc_e5345d15-54e7-4c42-92d2-e3f4d63e9533/manager/0.log" Feb 18 06:44:15 crc kubenswrapper[4869]: I0218 06:44:15.360817 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-55txz_da28bc19-c4b2-4d9a-8357-6ce9680567ce/manager/0.log" Feb 18 06:44:15 crc kubenswrapper[4869]: I0218 06:44:15.494279 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-fnrx8_ea6c026b-8825-42ec-8b66-9c2842957c10/manager/0.log" Feb 18 06:44:15 crc kubenswrapper[4869]: I0218 06:44:15.655947 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-btm9t_3519f676-e828-4ec9-8995-ecf778e36d4f/manager/0.log" Feb 18 06:44:15 crc kubenswrapper[4869]: I0218 06:44:15.768940 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-5t2ll_3627c187-4d3b-49cb-9367-5758e676b0af/manager/0.log" Feb 18 06:44:15 crc kubenswrapper[4869]: I0218 06:44:15.951310 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-sbq48_835a35ac-1347-46f7-ae71-aa38e8aea7cf/manager/0.log" Feb 18 
06:44:16 crc kubenswrapper[4869]: I0218 06:44:16.223293 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-drwmx_cbb0292f-e15f-4a01-bd91-1c155779be07/manager/0.log" Feb 18 06:44:16 crc kubenswrapper[4869]: I0218 06:44:16.312454 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-vfs4v_269fa527-4152-4014-b070-7e651d5f7b2f/manager/0.log" Feb 18 06:44:16 crc kubenswrapper[4869]: I0218 06:44:16.691450 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv_8fdd1d16-0b06-4553-b43a-943fb22f8961/manager/0.log" Feb 18 06:44:17 crc kubenswrapper[4869]: I0218 06:44:17.143531 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-766dc4fc6-rtp2x_a795b61b-e61c-46e5-a72e-e64ca8421756/operator/0.log" Feb 18 06:44:17 crc kubenswrapper[4869]: I0218 06:44:17.336787 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-crznf_5f3e331a-a2a6-4bd2-adce-f586154b805c/registry-server/0.log" Feb 18 06:44:17 crc kubenswrapper[4869]: I0218 06:44:17.635310 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-gbx94_e90352c9-520d-40dc-b9f6-3919a8bd67fb/manager/0.log" Feb 18 06:44:17 crc kubenswrapper[4869]: I0218 06:44:17.863437 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-47sw8_e2de3218-8c57-43ab-b45e-e69a92456549/manager/0.log" Feb 18 06:44:18 crc kubenswrapper[4869]: I0218 06:44:18.084311 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-wfxk8_9d84cc45-ca42-454f-9323-35d717ea7cd4/operator/0.log" 
Feb 18 06:44:18 crc kubenswrapper[4869]: I0218 06:44:18.202781 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-2ctsx_78799685-a70e-4b5d-ae0f-fbd4ac1f48fd/manager/0.log"
Feb 18 06:44:18 crc kubenswrapper[4869]: I0218 06:44:18.335132 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-86mkx_f954d2f9-baf9-4d98-bee1-05598035e3a1/manager/0.log"
Feb 18 06:44:18 crc kubenswrapper[4869]: I0218 06:44:18.500202 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-xkrff_63e77cf1-d554-4e43-a6e0-93e671cc90fc/manager/0.log"
Feb 18 06:44:18 crc kubenswrapper[4869]: I0218 06:44:18.533966 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-6nn8b_c7070d2d-1fcc-4ae8-9380-d0f500c95d01/manager/0.log"
Feb 18 06:44:18 crc kubenswrapper[4869]: I0218 06:44:18.741528 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-t6pg6_fe9a7273-de20-4420-8335-dc291458c338/manager/0.log"
Feb 18 06:44:18 crc kubenswrapper[4869]: I0218 06:44:18.939594 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-dccc9b448-nffdh_3990868e-7ca4-439b-a244-6a3336628877/manager/0.log"
Feb 18 06:44:19 crc kubenswrapper[4869]: I0218 06:44:19.937669 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-q5gnw_c1a14efa-4a9b-49b6-a882-c0d080269850/manager/0.log"
Feb 18 06:44:37 crc kubenswrapper[4869]: I0218 06:44:37.045601 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-pmqtp_be5b136e-e664-4ed5-9fbd-e2a9bdd06db9/control-plane-machine-set-operator/0.log"
Feb 18 06:44:37 crc kubenswrapper[4869]: I0218 06:44:37.233631 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lljlj_7411fb2b-cd62-452d-a5c8-94135752329d/machine-api-operator/0.log"
Feb 18 06:44:37 crc kubenswrapper[4869]: I0218 06:44:37.287277 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lljlj_7411fb2b-cd62-452d-a5c8-94135752329d/kube-rbac-proxy/0.log"
Feb 18 06:44:40 crc kubenswrapper[4869]: I0218 06:44:40.132371 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 06:44:40 crc kubenswrapper[4869]: I0218 06:44:40.132713 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 06:44:49 crc kubenswrapper[4869]: I0218 06:44:49.567641 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-xc26m_2742a7dc-644d-4ede-be60-c014ffd5ad38/cert-manager-controller/0.log"
Feb 18 06:44:49 crc kubenswrapper[4869]: I0218 06:44:49.793165 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-v52pk_6b66e86e-81f9-44f2-b711-ad17fc1504a6/cert-manager-cainjector/0.log"
Feb 18 06:44:49 crc kubenswrapper[4869]: I0218 06:44:49.885156 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-zvwk2_6e6c32f7-24fc-4637-9607-bef3c0d85bb7/cert-manager-webhook/0.log"
Feb 18 06:45:00 crc kubenswrapper[4869]: I0218 06:45:00.147241 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523285-9bjg4"]
Feb 18 06:45:00 crc kubenswrapper[4869]: E0218 06:45:00.148358 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="059f550c-4150-4931-85b8-68fec5650761" containerName="container-00"
Feb 18 06:45:00 crc kubenswrapper[4869]: I0218 06:45:00.148379 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="059f550c-4150-4931-85b8-68fec5650761" containerName="container-00"
Feb 18 06:45:00 crc kubenswrapper[4869]: I0218 06:45:00.148586 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="059f550c-4150-4931-85b8-68fec5650761" containerName="container-00"
Feb 18 06:45:00 crc kubenswrapper[4869]: I0218 06:45:00.149390 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-9bjg4"
Feb 18 06:45:00 crc kubenswrapper[4869]: I0218 06:45:00.152957 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 18 06:45:00 crc kubenswrapper[4869]: I0218 06:45:00.153220 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 18 06:45:00 crc kubenswrapper[4869]: I0218 06:45:00.156782 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523285-9bjg4"]
Feb 18 06:45:00 crc kubenswrapper[4869]: I0218 06:45:00.263533 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpmfp\" (UniqueName: \"kubernetes.io/projected/606a64c1-ef3d-4080-ad4d-c2092e6778e6-kube-api-access-jpmfp\") pod \"collect-profiles-29523285-9bjg4\" (UID: \"606a64c1-ef3d-4080-ad4d-c2092e6778e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-9bjg4"
Feb 18 06:45:00 crc kubenswrapper[4869]: I0218 06:45:00.264021 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/606a64c1-ef3d-4080-ad4d-c2092e6778e6-config-volume\") pod \"collect-profiles-29523285-9bjg4\" (UID: \"606a64c1-ef3d-4080-ad4d-c2092e6778e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-9bjg4"
Feb 18 06:45:00 crc kubenswrapper[4869]: I0218 06:45:00.264118 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/606a64c1-ef3d-4080-ad4d-c2092e6778e6-secret-volume\") pod \"collect-profiles-29523285-9bjg4\" (UID: \"606a64c1-ef3d-4080-ad4d-c2092e6778e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-9bjg4"
Feb 18 06:45:00 crc kubenswrapper[4869]: I0218 06:45:00.365271 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/606a64c1-ef3d-4080-ad4d-c2092e6778e6-config-volume\") pod \"collect-profiles-29523285-9bjg4\" (UID: \"606a64c1-ef3d-4080-ad4d-c2092e6778e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-9bjg4"
Feb 18 06:45:00 crc kubenswrapper[4869]: I0218 06:45:00.365362 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/606a64c1-ef3d-4080-ad4d-c2092e6778e6-secret-volume\") pod \"collect-profiles-29523285-9bjg4\" (UID: \"606a64c1-ef3d-4080-ad4d-c2092e6778e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-9bjg4"
Feb 18 06:45:00 crc kubenswrapper[4869]: I0218 06:45:00.365415 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpmfp\" (UniqueName: \"kubernetes.io/projected/606a64c1-ef3d-4080-ad4d-c2092e6778e6-kube-api-access-jpmfp\") pod \"collect-profiles-29523285-9bjg4\" (UID: \"606a64c1-ef3d-4080-ad4d-c2092e6778e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-9bjg4"
Feb 18 06:45:00 crc kubenswrapper[4869]: I0218 06:45:00.366221 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/606a64c1-ef3d-4080-ad4d-c2092e6778e6-config-volume\") pod \"collect-profiles-29523285-9bjg4\" (UID: \"606a64c1-ef3d-4080-ad4d-c2092e6778e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-9bjg4"
Feb 18 06:45:00 crc kubenswrapper[4869]: I0218 06:45:00.370834 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/606a64c1-ef3d-4080-ad4d-c2092e6778e6-secret-volume\") pod \"collect-profiles-29523285-9bjg4\" (UID: \"606a64c1-ef3d-4080-ad4d-c2092e6778e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-9bjg4"
Feb 18 06:45:00 crc kubenswrapper[4869]: I0218 06:45:00.380627 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpmfp\" (UniqueName: \"kubernetes.io/projected/606a64c1-ef3d-4080-ad4d-c2092e6778e6-kube-api-access-jpmfp\") pod \"collect-profiles-29523285-9bjg4\" (UID: \"606a64c1-ef3d-4080-ad4d-c2092e6778e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-9bjg4"
Feb 18 06:45:00 crc kubenswrapper[4869]: I0218 06:45:00.478728 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-9bjg4"
Feb 18 06:45:00 crc kubenswrapper[4869]: I0218 06:45:00.910524 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523285-9bjg4"]
Feb 18 06:45:01 crc kubenswrapper[4869]: I0218 06:45:01.677434 4869 generic.go:334] "Generic (PLEG): container finished" podID="606a64c1-ef3d-4080-ad4d-c2092e6778e6" containerID="2a4a255f4041a71583703ac7f0f605707ee69f49eff3fd2cc07aed6a35bfd8ee" exitCode=0
Feb 18 06:45:01 crc kubenswrapper[4869]: I0218 06:45:01.677818 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-9bjg4" event={"ID":"606a64c1-ef3d-4080-ad4d-c2092e6778e6","Type":"ContainerDied","Data":"2a4a255f4041a71583703ac7f0f605707ee69f49eff3fd2cc07aed6a35bfd8ee"}
Feb 18 06:45:01 crc kubenswrapper[4869]: I0218 06:45:01.677849 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-9bjg4" event={"ID":"606a64c1-ef3d-4080-ad4d-c2092e6778e6","Type":"ContainerStarted","Data":"a28f40ccc695e88707ea5a94b455e4abd174d645d2b0e7e7ca0fd2b3c958d850"}
Feb 18 06:45:02 crc kubenswrapper[4869]: I0218 06:45:02.074235 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-x7pfd_babd7df7-75bc-470a-a960-c1d0317f2f8e/nmstate-console-plugin/0.log"
Feb 18 06:45:02 crc kubenswrapper[4869]: I0218 06:45:02.262837 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-kljbt_02b9ed08-af4d-434a-8042-9b4acedf423c/nmstate-handler/0.log"
Feb 18 06:45:02 crc kubenswrapper[4869]: I0218 06:45:02.300587 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-tkd86_9c59b69a-72a6-4ce0-9b47-c53016b5ac3a/kube-rbac-proxy/0.log"
Feb 18 06:45:02 crc kubenswrapper[4869]: I0218 06:45:02.427476 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-tkd86_9c59b69a-72a6-4ce0-9b47-c53016b5ac3a/nmstate-metrics/0.log"
Feb 18 06:45:02 crc kubenswrapper[4869]: I0218 06:45:02.489656 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-mfnzm_851e9bf0-c3bc-454d-a3e3-ade0dd734f5a/nmstate-operator/0.log"
Feb 18 06:45:02 crc kubenswrapper[4869]: I0218 06:45:02.665197 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-j5g4n_1582a9d4-d3f8-4ca1-8853-6e6e8cc10d92/nmstate-webhook/0.log"
Feb 18 06:45:03 crc kubenswrapper[4869]: I0218 06:45:03.016397 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-9bjg4"
Feb 18 06:45:03 crc kubenswrapper[4869]: I0218 06:45:03.120339 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/606a64c1-ef3d-4080-ad4d-c2092e6778e6-secret-volume\") pod \"606a64c1-ef3d-4080-ad4d-c2092e6778e6\" (UID: \"606a64c1-ef3d-4080-ad4d-c2092e6778e6\") "
Feb 18 06:45:03 crc kubenswrapper[4869]: I0218 06:45:03.120542 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpmfp\" (UniqueName: \"kubernetes.io/projected/606a64c1-ef3d-4080-ad4d-c2092e6778e6-kube-api-access-jpmfp\") pod \"606a64c1-ef3d-4080-ad4d-c2092e6778e6\" (UID: \"606a64c1-ef3d-4080-ad4d-c2092e6778e6\") "
Feb 18 06:45:03 crc kubenswrapper[4869]: I0218 06:45:03.120592 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/606a64c1-ef3d-4080-ad4d-c2092e6778e6-config-volume\") pod \"606a64c1-ef3d-4080-ad4d-c2092e6778e6\" (UID: \"606a64c1-ef3d-4080-ad4d-c2092e6778e6\") "
Feb 18 06:45:03 crc kubenswrapper[4869]: I0218 06:45:03.121876 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/606a64c1-ef3d-4080-ad4d-c2092e6778e6-config-volume" (OuterVolumeSpecName: "config-volume") pod "606a64c1-ef3d-4080-ad4d-c2092e6778e6" (UID: "606a64c1-ef3d-4080-ad4d-c2092e6778e6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:45:03 crc kubenswrapper[4869]: I0218 06:45:03.126664 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/606a64c1-ef3d-4080-ad4d-c2092e6778e6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "606a64c1-ef3d-4080-ad4d-c2092e6778e6" (UID: "606a64c1-ef3d-4080-ad4d-c2092e6778e6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:45:03 crc kubenswrapper[4869]: I0218 06:45:03.128955 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/606a64c1-ef3d-4080-ad4d-c2092e6778e6-kube-api-access-jpmfp" (OuterVolumeSpecName: "kube-api-access-jpmfp") pod "606a64c1-ef3d-4080-ad4d-c2092e6778e6" (UID: "606a64c1-ef3d-4080-ad4d-c2092e6778e6"). InnerVolumeSpecName "kube-api-access-jpmfp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:45:03 crc kubenswrapper[4869]: I0218 06:45:03.222805 4869 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/606a64c1-ef3d-4080-ad4d-c2092e6778e6-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 18 06:45:03 crc kubenswrapper[4869]: I0218 06:45:03.222851 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpmfp\" (UniqueName: \"kubernetes.io/projected/606a64c1-ef3d-4080-ad4d-c2092e6778e6-kube-api-access-jpmfp\") on node \"crc\" DevicePath \"\""
Feb 18 06:45:03 crc kubenswrapper[4869]: I0218 06:45:03.222882 4869 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/606a64c1-ef3d-4080-ad4d-c2092e6778e6-config-volume\") on node \"crc\" DevicePath \"\""
Feb 18 06:45:03 crc kubenswrapper[4869]: I0218 06:45:03.693043 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-9bjg4" event={"ID":"606a64c1-ef3d-4080-ad4d-c2092e6778e6","Type":"ContainerDied","Data":"a28f40ccc695e88707ea5a94b455e4abd174d645d2b0e7e7ca0fd2b3c958d850"}
Feb 18 06:45:03 crc kubenswrapper[4869]: I0218 06:45:03.693086 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a28f40ccc695e88707ea5a94b455e4abd174d645d2b0e7e7ca0fd2b3c958d850"
Feb 18 06:45:03 crc kubenswrapper[4869]: I0218 06:45:03.693166 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-9bjg4"
Feb 18 06:45:04 crc kubenswrapper[4869]: I0218 06:45:04.098994 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523240-4nxtb"]
Feb 18 06:45:04 crc kubenswrapper[4869]: I0218 06:45:04.109654 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523240-4nxtb"]
Feb 18 06:45:05 crc kubenswrapper[4869]: I0218 06:45:05.483670 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b78c004-29c7-4c37-8eac-826aa3f04eb9" path="/var/lib/kubelet/pods/9b78c004-29c7-4c37-8eac-826aa3f04eb9/volumes"
Feb 18 06:45:10 crc kubenswrapper[4869]: I0218 06:45:10.132963 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 06:45:10 crc kubenswrapper[4869]: I0218 06:45:10.133558 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 06:45:27 crc kubenswrapper[4869]: I0218 06:45:27.618463 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-sh2wz_e23bbfea-0160-46be-ae71-7ff977953af2/kube-rbac-proxy/0.log"
Feb 18 06:45:27 crc kubenswrapper[4869]: I0218 06:45:27.734758 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-sh2wz_e23bbfea-0160-46be-ae71-7ff977953af2/controller/0.log"
Feb 18 06:45:27 crc kubenswrapper[4869]: I0218 06:45:27.846731 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/cp-frr-files/0.log"
Feb 18 06:45:28 crc kubenswrapper[4869]: I0218 06:45:28.082124 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/cp-frr-files/0.log"
Feb 18 06:45:28 crc kubenswrapper[4869]: I0218 06:45:28.086836 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/cp-metrics/0.log"
Feb 18 06:45:28 crc kubenswrapper[4869]: I0218 06:45:28.094401 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/cp-reloader/0.log"
Feb 18 06:45:28 crc kubenswrapper[4869]: I0218 06:45:28.121058 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/cp-reloader/0.log"
Feb 18 06:45:28 crc kubenswrapper[4869]: I0218 06:45:28.291652 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/cp-metrics/0.log"
Feb 18 06:45:28 crc kubenswrapper[4869]: I0218 06:45:28.293154 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/cp-frr-files/0.log"
Feb 18 06:45:28 crc kubenswrapper[4869]: I0218 06:45:28.298989 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/cp-reloader/0.log"
Feb 18 06:45:28 crc kubenswrapper[4869]: I0218 06:45:28.325117 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/cp-metrics/0.log"
Feb 18 06:45:28 crc kubenswrapper[4869]: I0218 06:45:28.483569 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/cp-frr-files/0.log"
Feb 18 06:45:28 crc kubenswrapper[4869]: I0218 06:45:28.511771 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/cp-metrics/0.log"
Feb 18 06:45:28 crc kubenswrapper[4869]: I0218 06:45:28.534263 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/cp-reloader/0.log"
Feb 18 06:45:28 crc kubenswrapper[4869]: I0218 06:45:28.537725 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/controller/0.log"
Feb 18 06:45:28 crc kubenswrapper[4869]: I0218 06:45:28.696970 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/frr-metrics/0.log"
Feb 18 06:45:28 crc kubenswrapper[4869]: I0218 06:45:28.712424 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/kube-rbac-proxy/0.log"
Feb 18 06:45:28 crc kubenswrapper[4869]: I0218 06:45:28.749381 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/kube-rbac-proxy-frr/0.log"
Feb 18 06:45:28 crc kubenswrapper[4869]: I0218 06:45:28.887498 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/reloader/0.log"
Feb 18 06:45:28 crc kubenswrapper[4869]: I0218 06:45:28.955259 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-r6mvd_c67c84c4-b4c3-4336-a6b9-3543258cea17/frr-k8s-webhook-server/0.log"
Feb 18 06:45:29 crc kubenswrapper[4869]: I0218 06:45:29.180719 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-55df77c686-fqtt5_74eef01f-c0d7-449c-bca9-7eb78f808110/manager/0.log"
Feb 18 06:45:29 crc kubenswrapper[4869]: I0218 06:45:29.295115 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-69666c74dd-pv6sz_d597c072-fd48-4245-8a9a-5a80aaa78993/webhook-server/0.log"
Feb 18 06:45:29 crc kubenswrapper[4869]: I0218 06:45:29.434504 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-v8dwx_f63c8d8a-ba54-4ddf-9105-5f886b7984d9/kube-rbac-proxy/0.log"
Feb 18 06:45:29 crc kubenswrapper[4869]: I0218 06:45:29.980885 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-v8dwx_f63c8d8a-ba54-4ddf-9105-5f886b7984d9/speaker/0.log"
Feb 18 06:45:29 crc kubenswrapper[4869]: I0218 06:45:29.985723 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/frr/0.log"
Feb 18 06:45:32 crc kubenswrapper[4869]: I0218 06:45:32.566290 4869 scope.go:117] "RemoveContainer" containerID="3af12797c6f08fcd50bee7de7f7d0c55719b2411fd0f91380d165a9d29f04fe1"
Feb 18 06:45:40 crc kubenswrapper[4869]: I0218 06:45:40.133577 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 06:45:40 crc kubenswrapper[4869]: I0218 06:45:40.134175 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 06:45:40 crc kubenswrapper[4869]: I0218 06:45:40.134230 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh"
Feb 18 06:45:40 crc kubenswrapper[4869]: I0218 06:45:40.135169 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"500f375524c1e00665538fefd3cad0a388148415b60956f273f733065771d7f1"} pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 18 06:45:40 crc kubenswrapper[4869]: I0218 06:45:40.135246 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" containerID="cri-o://500f375524c1e00665538fefd3cad0a388148415b60956f273f733065771d7f1" gracePeriod=600
Feb 18 06:45:41 crc kubenswrapper[4869]: I0218 06:45:41.031898 4869 generic.go:334] "Generic (PLEG): container finished" podID="781aec66-5fc7-4161-a704-cc78830d525d" containerID="500f375524c1e00665538fefd3cad0a388148415b60956f273f733065771d7f1" exitCode=0
Feb 18 06:45:41 crc kubenswrapper[4869]: I0218 06:45:41.032539 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerDied","Data":"500f375524c1e00665538fefd3cad0a388148415b60956f273f733065771d7f1"}
Feb 18 06:45:41 crc kubenswrapper[4869]: I0218 06:45:41.032566 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerStarted","Data":"cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae"}
Feb 18 06:45:41 crc kubenswrapper[4869]: I0218 06:45:41.032581 4869 scope.go:117] "RemoveContainer" containerID="23ac42bb9e62c89fd55b54804cb408e93e69dd4583aadcc6d9de7e5e3e8ecfe6"
Feb 18 06:45:42 crc kubenswrapper[4869]: I0218 06:45:42.903851 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d_1730c713-0f10-4294-a20d-c7d7a2d4403f/util/0.log"
Feb 18 06:45:43 crc kubenswrapper[4869]: I0218 06:45:43.222877 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d_1730c713-0f10-4294-a20d-c7d7a2d4403f/util/0.log"
Feb 18 06:45:43 crc kubenswrapper[4869]: I0218 06:45:43.238730 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d_1730c713-0f10-4294-a20d-c7d7a2d4403f/pull/0.log"
Feb 18 06:45:43 crc kubenswrapper[4869]: I0218 06:45:43.246823 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d_1730c713-0f10-4294-a20d-c7d7a2d4403f/pull/0.log"
Feb 18 06:45:43 crc kubenswrapper[4869]: I0218 06:45:43.393119 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d_1730c713-0f10-4294-a20d-c7d7a2d4403f/util/0.log"
Feb 18 06:45:43 crc kubenswrapper[4869]: I0218 06:45:43.399663 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d_1730c713-0f10-4294-a20d-c7d7a2d4403f/pull/0.log"
Feb 18 06:45:43 crc kubenswrapper[4869]: I0218 06:45:43.465930 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d_1730c713-0f10-4294-a20d-c7d7a2d4403f/extract/0.log"
Feb 18 06:45:43 crc kubenswrapper[4869]: I0218 06:45:43.579139 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-d6h99_7d008101-ce8c-46ba-986d-a6c77a268c0b/extract-utilities/0.log"
Feb 18 06:45:43 crc kubenswrapper[4869]: I0218 06:45:43.727232 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-d6h99_7d008101-ce8c-46ba-986d-a6c77a268c0b/extract-content/0.log"
Feb 18 06:45:43 crc kubenswrapper[4869]: I0218 06:45:43.736853 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-d6h99_7d008101-ce8c-46ba-986d-a6c77a268c0b/extract-utilities/0.log"
Feb 18 06:45:43 crc kubenswrapper[4869]: I0218 06:45:43.747761 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-d6h99_7d008101-ce8c-46ba-986d-a6c77a268c0b/extract-content/0.log"
Feb 18 06:45:43 crc kubenswrapper[4869]: I0218 06:45:43.916377 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-d6h99_7d008101-ce8c-46ba-986d-a6c77a268c0b/extract-utilities/0.log"
Feb 18 06:45:43 crc kubenswrapper[4869]: I0218 06:45:43.941685 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-d6h99_7d008101-ce8c-46ba-986d-a6c77a268c0b/extract-content/0.log"
Feb 18 06:45:44 crc kubenswrapper[4869]: I0218 06:45:44.188643 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dfdnd_dd620fad-d652-4a98-95d7-8470686b4219/extract-utilities/0.log"
Feb 18 06:45:44 crc kubenswrapper[4869]: I0218 06:45:44.310499 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dfdnd_dd620fad-d652-4a98-95d7-8470686b4219/extract-content/0.log"
Feb 18 06:45:44 crc kubenswrapper[4869]: I0218 06:45:44.342051 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dfdnd_dd620fad-d652-4a98-95d7-8470686b4219/extract-utilities/0.log"
Feb 18 06:45:44 crc kubenswrapper[4869]: I0218 06:45:44.390479 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dfdnd_dd620fad-d652-4a98-95d7-8470686b4219/extract-content/0.log"
Feb 18 06:45:44 crc kubenswrapper[4869]: I0218 06:45:44.458658 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-d6h99_7d008101-ce8c-46ba-986d-a6c77a268c0b/registry-server/0.log"
Feb 18 06:45:44 crc kubenswrapper[4869]: I0218 06:45:44.640677 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dfdnd_dd620fad-d652-4a98-95d7-8470686b4219/extract-utilities/0.log"
Feb 18 06:45:44 crc kubenswrapper[4869]: I0218 06:45:44.649206 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dfdnd_dd620fad-d652-4a98-95d7-8470686b4219/extract-content/0.log"
Feb 18 06:45:44 crc kubenswrapper[4869]: I0218 06:45:44.846950 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7_dd932d93-4a7d-4779-8833-23887167b576/util/0.log"
Feb 18 06:45:44 crc kubenswrapper[4869]: I0218 06:45:44.961638 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dfdnd_dd620fad-d652-4a98-95d7-8470686b4219/registry-server/0.log"
Feb 18 06:45:45 crc kubenswrapper[4869]: I0218 06:45:45.037249 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7_dd932d93-4a7d-4779-8833-23887167b576/pull/0.log"
Feb 18 06:45:45 crc kubenswrapper[4869]: I0218 06:45:45.056597 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7_dd932d93-4a7d-4779-8833-23887167b576/pull/0.log"
Feb 18 06:45:45 crc kubenswrapper[4869]: I0218 06:45:45.081855 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7_dd932d93-4a7d-4779-8833-23887167b576/util/0.log"
Feb 18 06:45:45 crc kubenswrapper[4869]: I0218 06:45:45.245355 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7_dd932d93-4a7d-4779-8833-23887167b576/util/0.log"
Feb 18 06:45:45 crc kubenswrapper[4869]: I0218 06:45:45.264926 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7_dd932d93-4a7d-4779-8833-23887167b576/pull/0.log"
Feb 18 06:45:45 crc kubenswrapper[4869]: I0218 06:45:45.288686 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7_dd932d93-4a7d-4779-8833-23887167b576/extract/0.log"
Feb 18 06:45:45 crc kubenswrapper[4869]: I0218 06:45:45.466463 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-997j2_89d643e7-cad7-4856-9d82-c0370e1f20e5/marketplace-operator/0.log"
Feb 18 06:45:45 crc kubenswrapper[4869]: I0218 06:45:45.501320 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rnzzv_d26ef514-863b-48dd-8576-d68036b43bf6/extract-utilities/0.log"
Feb 18 06:45:45 crc kubenswrapper[4869]: I0218 06:45:45.651758 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rnzzv_d26ef514-863b-48dd-8576-d68036b43bf6/extract-content/0.log"
Feb 18 06:45:45 crc kubenswrapper[4869]: I0218 06:45:45.652395 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rnzzv_d26ef514-863b-48dd-8576-d68036b43bf6/extract-utilities/0.log"
Feb 18 06:45:45 crc kubenswrapper[4869]: I0218 06:45:45.723504 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rnzzv_d26ef514-863b-48dd-8576-d68036b43bf6/extract-content/0.log"
Feb 18 06:45:45 crc kubenswrapper[4869]: I0218 06:45:45.875608 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rnzzv_d26ef514-863b-48dd-8576-d68036b43bf6/extract-utilities/0.log"
Feb 18 06:45:45 crc kubenswrapper[4869]: I0218 06:45:45.912925 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rnzzv_d26ef514-863b-48dd-8576-d68036b43bf6/extract-content/0.log"
Feb 18 06:45:46 crc kubenswrapper[4869]: I0218 06:45:46.070671 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rnzzv_d26ef514-863b-48dd-8576-d68036b43bf6/registry-server/0.log"
Feb 18 06:45:46 crc kubenswrapper[4869]: I0218 06:45:46.131118 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j6klp_83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22/extract-utilities/0.log"
Feb 18 06:45:46 crc kubenswrapper[4869]: I0218 06:45:46.305605 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j6klp_83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22/extract-content/0.log"
Feb 18 06:45:46 crc kubenswrapper[4869]: I0218 06:45:46.329555 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j6klp_83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22/extract-content/0.log"
Feb 18 06:45:46 crc kubenswrapper[4869]: I0218 06:45:46.335395 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j6klp_83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22/extract-utilities/0.log"
Feb 18 06:45:46 crc kubenswrapper[4869]: I0218 06:45:46.508412 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j6klp_83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22/extract-content/0.log"
Feb 18 06:45:46 crc kubenswrapper[4869]: I0218 06:45:46.510563 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j6klp_83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22/extract-utilities/0.log"
Feb 18 06:45:46 crc kubenswrapper[4869]: I0218 06:45:46.978107 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j6klp_83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22/registry-server/0.log"
Feb 18 06:46:16 crc kubenswrapper[4869]: E0218 06:46:16.701267 4869 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.50:48728->38.102.83.50:44207: read tcp 38.102.83.50:48728->38.102.83.50:44207: read: connection reset by peer
Feb 18 06:47:31 crc kubenswrapper[4869]: E0218 06:47:31.680428 4869 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37a8150e_3880_4f37_a4e2_57d498278882.slice/crio-conmon-18fa004024c4100284a351ad80918e63487e8ad2920e7f72307f7195fd58a19c.scope\": RecentStats: unable to find data in memory cache]"
Feb 18 06:47:32 crc kubenswrapper[4869]: I0218 06:47:32.091696 4869 generic.go:334] "Generic (PLEG): container finished" podID="37a8150e-3880-4f37-a4e2-57d498278882" containerID="18fa004024c4100284a351ad80918e63487e8ad2920e7f72307f7195fd58a19c" exitCode=0
Feb 18 06:47:32 crc kubenswrapper[4869]: I0218 06:47:32.091786 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5lb6s/must-gather-s4scs" event={"ID":"37a8150e-3880-4f37-a4e2-57d498278882","Type":"ContainerDied","Data":"18fa004024c4100284a351ad80918e63487e8ad2920e7f72307f7195fd58a19c"}
Feb 18 06:47:32 crc kubenswrapper[4869]: I0218 06:47:32.092768 4869 scope.go:117] "RemoveContainer" containerID="18fa004024c4100284a351ad80918e63487e8ad2920e7f72307f7195fd58a19c"
Feb 18 06:47:32 crc kubenswrapper[4869]: I0218 06:47:32.564644 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5lb6s_must-gather-s4scs_37a8150e-3880-4f37-a4e2-57d498278882/gather/0.log"
Feb 18 06:47:40 crc kubenswrapper[4869]: I0218 06:47:40.001852 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5lb6s/must-gather-s4scs"]
Feb 18 06:47:40 crc kubenswrapper[4869]: I0218 06:47:40.003463 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-5lb6s/must-gather-s4scs" podUID="37a8150e-3880-4f37-a4e2-57d498278882" containerName="copy" containerID="cri-o://7c7196dc21bef9696b4b78905f9a759846b43fa4d6e0029ce601017856ab7d45" gracePeriod=2
Feb 18 06:47:40 crc kubenswrapper[4869]: I0218 06:47:40.011826 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5lb6s/must-gather-s4scs"]
Feb 18 06:47:40 crc kubenswrapper[4869]: I0218 06:47:40.133444 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 06:47:40 crc kubenswrapper[4869]: I0218 06:47:40.133501 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 06:47:40 crc kubenswrapper[4869]: I0218 06:47:40.171660 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5lb6s_must-gather-s4scs_37a8150e-3880-4f37-a4e2-57d498278882/copy/0.log" Feb 18 06:47:40 crc kubenswrapper[4869]: I0218 06:47:40.172008 4869 generic.go:334] "Generic (PLEG): container finished" podID="37a8150e-3880-4f37-a4e2-57d498278882" containerID="7c7196dc21bef9696b4b78905f9a759846b43fa4d6e0029ce601017856ab7d45" exitCode=143 Feb 18 06:47:40 crc kubenswrapper[4869]: I0218 06:47:40.458370 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5lb6s_must-gather-s4scs_37a8150e-3880-4f37-a4e2-57d498278882/copy/0.log" Feb 18 06:47:40 crc kubenswrapper[4869]: I0218 06:47:40.458789 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5lb6s/must-gather-s4scs" Feb 18 06:47:40 crc kubenswrapper[4869]: I0218 06:47:40.571130 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/37a8150e-3880-4f37-a4e2-57d498278882-must-gather-output\") pod \"37a8150e-3880-4f37-a4e2-57d498278882\" (UID: \"37a8150e-3880-4f37-a4e2-57d498278882\") " Feb 18 06:47:40 crc kubenswrapper[4869]: I0218 06:47:40.572823 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqxfw\" (UniqueName: \"kubernetes.io/projected/37a8150e-3880-4f37-a4e2-57d498278882-kube-api-access-wqxfw\") pod \"37a8150e-3880-4f37-a4e2-57d498278882\" (UID: \"37a8150e-3880-4f37-a4e2-57d498278882\") " Feb 18 06:47:40 crc kubenswrapper[4869]: I0218 06:47:40.579435 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37a8150e-3880-4f37-a4e2-57d498278882-kube-api-access-wqxfw" (OuterVolumeSpecName: "kube-api-access-wqxfw") pod "37a8150e-3880-4f37-a4e2-57d498278882" (UID: "37a8150e-3880-4f37-a4e2-57d498278882"). 
InnerVolumeSpecName "kube-api-access-wqxfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:47:40 crc kubenswrapper[4869]: I0218 06:47:40.674893 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqxfw\" (UniqueName: \"kubernetes.io/projected/37a8150e-3880-4f37-a4e2-57d498278882-kube-api-access-wqxfw\") on node \"crc\" DevicePath \"\"" Feb 18 06:47:40 crc kubenswrapper[4869]: I0218 06:47:40.744535 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37a8150e-3880-4f37-a4e2-57d498278882-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "37a8150e-3880-4f37-a4e2-57d498278882" (UID: "37a8150e-3880-4f37-a4e2-57d498278882"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:47:40 crc kubenswrapper[4869]: I0218 06:47:40.776549 4869 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/37a8150e-3880-4f37-a4e2-57d498278882-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 18 06:47:41 crc kubenswrapper[4869]: I0218 06:47:41.182725 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5lb6s_must-gather-s4scs_37a8150e-3880-4f37-a4e2-57d498278882/copy/0.log" Feb 18 06:47:41 crc kubenswrapper[4869]: I0218 06:47:41.183256 4869 scope.go:117] "RemoveContainer" containerID="7c7196dc21bef9696b4b78905f9a759846b43fa4d6e0029ce601017856ab7d45" Feb 18 06:47:41 crc kubenswrapper[4869]: I0218 06:47:41.183316 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5lb6s/must-gather-s4scs" Feb 18 06:47:41 crc kubenswrapper[4869]: I0218 06:47:41.204828 4869 scope.go:117] "RemoveContainer" containerID="18fa004024c4100284a351ad80918e63487e8ad2920e7f72307f7195fd58a19c" Feb 18 06:47:41 crc kubenswrapper[4869]: I0218 06:47:41.479983 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37a8150e-3880-4f37-a4e2-57d498278882" path="/var/lib/kubelet/pods/37a8150e-3880-4f37-a4e2-57d498278882/volumes" Feb 18 06:48:10 crc kubenswrapper[4869]: I0218 06:48:10.132302 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:48:10 crc kubenswrapper[4869]: I0218 06:48:10.133199 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:48:40 crc kubenswrapper[4869]: I0218 06:48:40.132419 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:48:40 crc kubenswrapper[4869]: I0218 06:48:40.133079 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
Feb 18 06:48:40 crc kubenswrapper[4869]: I0218 06:48:40.133461 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" Feb 18 06:48:40 crc kubenswrapper[4869]: I0218 06:48:40.134376 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae"} pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 06:48:40 crc kubenswrapper[4869]: I0218 06:48:40.134460 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" containerID="cri-o://cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae" gracePeriod=600 Feb 18 06:48:40 crc kubenswrapper[4869]: E0218 06:48:40.258356 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:48:40 crc kubenswrapper[4869]: I0218 06:48:40.709549 4869 generic.go:334] "Generic (PLEG): container finished" podID="781aec66-5fc7-4161-a704-cc78830d525d" containerID="cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae" exitCode=0 Feb 18 06:48:40 crc kubenswrapper[4869]: I0218 06:48:40.709657 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" 
event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerDied","Data":"cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae"} Feb 18 06:48:40 crc kubenswrapper[4869]: I0218 06:48:40.709958 4869 scope.go:117] "RemoveContainer" containerID="500f375524c1e00665538fefd3cad0a388148415b60956f273f733065771d7f1" Feb 18 06:48:40 crc kubenswrapper[4869]: I0218 06:48:40.711022 4869 scope.go:117] "RemoveContainer" containerID="cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae" Feb 18 06:48:40 crc kubenswrapper[4869]: E0218 06:48:40.711512 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:48:53 crc kubenswrapper[4869]: I0218 06:48:53.479380 4869 scope.go:117] "RemoveContainer" containerID="cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae" Feb 18 06:48:53 crc kubenswrapper[4869]: E0218 06:48:53.480830 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:49:04 crc kubenswrapper[4869]: I0218 06:49:04.471319 4869 scope.go:117] "RemoveContainer" containerID="cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae" Feb 18 06:49:04 crc kubenswrapper[4869]: E0218 06:49:04.472128 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:49:18 crc kubenswrapper[4869]: I0218 06:49:18.470286 4869 scope.go:117] "RemoveContainer" containerID="cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae" Feb 18 06:49:18 crc kubenswrapper[4869]: E0218 06:49:18.471200 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:49:32 crc kubenswrapper[4869]: I0218 06:49:32.470209 4869 scope.go:117] "RemoveContainer" containerID="cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae" Feb 18 06:49:32 crc kubenswrapper[4869]: E0218 06:49:32.471377 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:49:44 crc kubenswrapper[4869]: I0218 06:49:44.471635 4869 scope.go:117] "RemoveContainer" containerID="cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae" Feb 18 06:49:44 crc kubenswrapper[4869]: E0218 06:49:44.473449 4869 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:49:55 crc kubenswrapper[4869]: I0218 06:49:55.471197 4869 scope.go:117] "RemoveContainer" containerID="cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae" Feb 18 06:49:55 crc kubenswrapper[4869]: E0218 06:49:55.472452 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:50:09 crc kubenswrapper[4869]: I0218 06:50:09.470983 4869 scope.go:117] "RemoveContainer" containerID="cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae" Feb 18 06:50:09 crc kubenswrapper[4869]: E0218 06:50:09.472473 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:50:15 crc kubenswrapper[4869]: I0218 06:50:15.641655 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mbkq5"] Feb 18 06:50:15 crc kubenswrapper[4869]: E0218 06:50:15.642980 4869 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="606a64c1-ef3d-4080-ad4d-c2092e6778e6" containerName="collect-profiles" Feb 18 06:50:15 crc kubenswrapper[4869]: I0218 06:50:15.643006 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="606a64c1-ef3d-4080-ad4d-c2092e6778e6" containerName="collect-profiles" Feb 18 06:50:15 crc kubenswrapper[4869]: E0218 06:50:15.643059 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37a8150e-3880-4f37-a4e2-57d498278882" containerName="gather" Feb 18 06:50:15 crc kubenswrapper[4869]: I0218 06:50:15.643072 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="37a8150e-3880-4f37-a4e2-57d498278882" containerName="gather" Feb 18 06:50:15 crc kubenswrapper[4869]: E0218 06:50:15.643109 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37a8150e-3880-4f37-a4e2-57d498278882" containerName="copy" Feb 18 06:50:15 crc kubenswrapper[4869]: I0218 06:50:15.643122 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="37a8150e-3880-4f37-a4e2-57d498278882" containerName="copy" Feb 18 06:50:15 crc kubenswrapper[4869]: I0218 06:50:15.643489 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="37a8150e-3880-4f37-a4e2-57d498278882" containerName="copy" Feb 18 06:50:15 crc kubenswrapper[4869]: I0218 06:50:15.643516 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="606a64c1-ef3d-4080-ad4d-c2092e6778e6" containerName="collect-profiles" Feb 18 06:50:15 crc kubenswrapper[4869]: I0218 06:50:15.643541 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="37a8150e-3880-4f37-a4e2-57d498278882" containerName="gather" Feb 18 06:50:15 crc kubenswrapper[4869]: I0218 06:50:15.645828 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mbkq5" Feb 18 06:50:15 crc kubenswrapper[4869]: I0218 06:50:15.659068 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mbkq5"] Feb 18 06:50:15 crc kubenswrapper[4869]: I0218 06:50:15.755862 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a002190-f78f-4b2f-8f59-ed69e8a623ba-utilities\") pod \"community-operators-mbkq5\" (UID: \"5a002190-f78f-4b2f-8f59-ed69e8a623ba\") " pod="openshift-marketplace/community-operators-mbkq5" Feb 18 06:50:15 crc kubenswrapper[4869]: I0218 06:50:15.756280 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a002190-f78f-4b2f-8f59-ed69e8a623ba-catalog-content\") pod \"community-operators-mbkq5\" (UID: \"5a002190-f78f-4b2f-8f59-ed69e8a623ba\") " pod="openshift-marketplace/community-operators-mbkq5" Feb 18 06:50:15 crc kubenswrapper[4869]: I0218 06:50:15.756412 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hxs2\" (UniqueName: \"kubernetes.io/projected/5a002190-f78f-4b2f-8f59-ed69e8a623ba-kube-api-access-5hxs2\") pod \"community-operators-mbkq5\" (UID: \"5a002190-f78f-4b2f-8f59-ed69e8a623ba\") " pod="openshift-marketplace/community-operators-mbkq5" Feb 18 06:50:15 crc kubenswrapper[4869]: I0218 06:50:15.858627 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a002190-f78f-4b2f-8f59-ed69e8a623ba-utilities\") pod \"community-operators-mbkq5\" (UID: \"5a002190-f78f-4b2f-8f59-ed69e8a623ba\") " pod="openshift-marketplace/community-operators-mbkq5" Feb 18 06:50:15 crc kubenswrapper[4869]: I0218 06:50:15.858942 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a002190-f78f-4b2f-8f59-ed69e8a623ba-catalog-content\") pod \"community-operators-mbkq5\" (UID: \"5a002190-f78f-4b2f-8f59-ed69e8a623ba\") " pod="openshift-marketplace/community-operators-mbkq5" Feb 18 06:50:15 crc kubenswrapper[4869]: I0218 06:50:15.859071 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hxs2\" (UniqueName: \"kubernetes.io/projected/5a002190-f78f-4b2f-8f59-ed69e8a623ba-kube-api-access-5hxs2\") pod \"community-operators-mbkq5\" (UID: \"5a002190-f78f-4b2f-8f59-ed69e8a623ba\") " pod="openshift-marketplace/community-operators-mbkq5" Feb 18 06:50:15 crc kubenswrapper[4869]: I0218 06:50:15.859478 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a002190-f78f-4b2f-8f59-ed69e8a623ba-catalog-content\") pod \"community-operators-mbkq5\" (UID: \"5a002190-f78f-4b2f-8f59-ed69e8a623ba\") " pod="openshift-marketplace/community-operators-mbkq5" Feb 18 06:50:15 crc kubenswrapper[4869]: I0218 06:50:15.859489 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a002190-f78f-4b2f-8f59-ed69e8a623ba-utilities\") pod \"community-operators-mbkq5\" (UID: \"5a002190-f78f-4b2f-8f59-ed69e8a623ba\") " pod="openshift-marketplace/community-operators-mbkq5" Feb 18 06:50:15 crc kubenswrapper[4869]: I0218 06:50:15.877774 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hxs2\" (UniqueName: \"kubernetes.io/projected/5a002190-f78f-4b2f-8f59-ed69e8a623ba-kube-api-access-5hxs2\") pod \"community-operators-mbkq5\" (UID: \"5a002190-f78f-4b2f-8f59-ed69e8a623ba\") " pod="openshift-marketplace/community-operators-mbkq5" Feb 18 06:50:16 crc kubenswrapper[4869]: I0218 06:50:16.001077 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mbkq5" Feb 18 06:50:16 crc kubenswrapper[4869]: I0218 06:50:16.378270 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mbkq5"] Feb 18 06:50:16 crc kubenswrapper[4869]: I0218 06:50:16.612061 4869 generic.go:334] "Generic (PLEG): container finished" podID="5a002190-f78f-4b2f-8f59-ed69e8a623ba" containerID="fec4612b7b5d891b4ced38d208ccdfd7ed8d97c4631caeaa80d4a428884bf3b0" exitCode=0 Feb 18 06:50:16 crc kubenswrapper[4869]: I0218 06:50:16.612103 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbkq5" event={"ID":"5a002190-f78f-4b2f-8f59-ed69e8a623ba","Type":"ContainerDied","Data":"fec4612b7b5d891b4ced38d208ccdfd7ed8d97c4631caeaa80d4a428884bf3b0"} Feb 18 06:50:16 crc kubenswrapper[4869]: I0218 06:50:16.612126 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbkq5" event={"ID":"5a002190-f78f-4b2f-8f59-ed69e8a623ba","Type":"ContainerStarted","Data":"adaa77a1fba0b5a3b7ec1f3256414d38a8414dc5b5cb4498f647b6ecfcb67d4a"} Feb 18 06:50:16 crc kubenswrapper[4869]: I0218 06:50:16.614836 4869 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 06:50:17 crc kubenswrapper[4869]: I0218 06:50:17.626639 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbkq5" event={"ID":"5a002190-f78f-4b2f-8f59-ed69e8a623ba","Type":"ContainerStarted","Data":"0661add15ee3454daf74cfbbc818ba00743dc68afa6ee840e39191b00c73ac18"} Feb 18 06:50:18 crc kubenswrapper[4869]: I0218 06:50:18.635965 4869 generic.go:334] "Generic (PLEG): container finished" podID="5a002190-f78f-4b2f-8f59-ed69e8a623ba" containerID="0661add15ee3454daf74cfbbc818ba00743dc68afa6ee840e39191b00c73ac18" exitCode=0 Feb 18 06:50:18 crc kubenswrapper[4869]: I0218 06:50:18.636079 4869 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-mbkq5" event={"ID":"5a002190-f78f-4b2f-8f59-ed69e8a623ba","Type":"ContainerDied","Data":"0661add15ee3454daf74cfbbc818ba00743dc68afa6ee840e39191b00c73ac18"} Feb 18 06:50:19 crc kubenswrapper[4869]: I0218 06:50:19.647018 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbkq5" event={"ID":"5a002190-f78f-4b2f-8f59-ed69e8a623ba","Type":"ContainerStarted","Data":"c5e7d4aa4c4a60751db9440e079e3b7f0ec7adc777f144e078a9b8250edfeb63"} Feb 18 06:50:19 crc kubenswrapper[4869]: I0218 06:50:19.672235 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mbkq5" podStartSLOduration=2.295349078 podStartE2EDuration="4.672215749s" podCreationTimestamp="2026-02-18 06:50:15 +0000 UTC" firstStartedPulling="2026-02-18 06:50:16.614582676 +0000 UTC m=+3713.783670908" lastFinishedPulling="2026-02-18 06:50:18.991449347 +0000 UTC m=+3716.160537579" observedRunningTime="2026-02-18 06:50:19.668333844 +0000 UTC m=+3716.837422086" watchObservedRunningTime="2026-02-18 06:50:19.672215749 +0000 UTC m=+3716.841303981" Feb 18 06:50:20 crc kubenswrapper[4869]: I0218 06:50:20.470485 4869 scope.go:117] "RemoveContainer" containerID="cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae" Feb 18 06:50:20 crc kubenswrapper[4869]: E0218 06:50:20.471028 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:50:24 crc kubenswrapper[4869]: I0218 06:50:24.462998 4869 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-b8bgt/must-gather-6wpzm"] Feb 18 06:50:24 crc kubenswrapper[4869]: I0218 06:50:24.465204 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b8bgt/must-gather-6wpzm" Feb 18 06:50:24 crc kubenswrapper[4869]: I0218 06:50:24.467620 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-b8bgt"/"default-dockercfg-7b7t7" Feb 18 06:50:24 crc kubenswrapper[4869]: I0218 06:50:24.468002 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-b8bgt"/"openshift-service-ca.crt" Feb 18 06:50:24 crc kubenswrapper[4869]: I0218 06:50:24.468209 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-b8bgt"/"kube-root-ca.crt" Feb 18 06:50:24 crc kubenswrapper[4869]: I0218 06:50:24.481619 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b8bgt/must-gather-6wpzm"] Feb 18 06:50:24 crc kubenswrapper[4869]: I0218 06:50:24.560262 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vqs4\" (UniqueName: \"kubernetes.io/projected/295226a5-0fdf-44b2-aed1-22bef38de348-kube-api-access-6vqs4\") pod \"must-gather-6wpzm\" (UID: \"295226a5-0fdf-44b2-aed1-22bef38de348\") " pod="openshift-must-gather-b8bgt/must-gather-6wpzm" Feb 18 06:50:24 crc kubenswrapper[4869]: I0218 06:50:24.561256 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/295226a5-0fdf-44b2-aed1-22bef38de348-must-gather-output\") pod \"must-gather-6wpzm\" (UID: \"295226a5-0fdf-44b2-aed1-22bef38de348\") " pod="openshift-must-gather-b8bgt/must-gather-6wpzm" Feb 18 06:50:24 crc kubenswrapper[4869]: I0218 06:50:24.663833 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/295226a5-0fdf-44b2-aed1-22bef38de348-must-gather-output\") pod \"must-gather-6wpzm\" (UID: \"295226a5-0fdf-44b2-aed1-22bef38de348\") " pod="openshift-must-gather-b8bgt/must-gather-6wpzm" Feb 18 06:50:24 crc kubenswrapper[4869]: I0218 06:50:24.664273 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vqs4\" (UniqueName: \"kubernetes.io/projected/295226a5-0fdf-44b2-aed1-22bef38de348-kube-api-access-6vqs4\") pod \"must-gather-6wpzm\" (UID: \"295226a5-0fdf-44b2-aed1-22bef38de348\") " pod="openshift-must-gather-b8bgt/must-gather-6wpzm" Feb 18 06:50:24 crc kubenswrapper[4869]: I0218 06:50:24.664279 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/295226a5-0fdf-44b2-aed1-22bef38de348-must-gather-output\") pod \"must-gather-6wpzm\" (UID: \"295226a5-0fdf-44b2-aed1-22bef38de348\") " pod="openshift-must-gather-b8bgt/must-gather-6wpzm" Feb 18 06:50:24 crc kubenswrapper[4869]: I0218 06:50:24.689451 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vqs4\" (UniqueName: \"kubernetes.io/projected/295226a5-0fdf-44b2-aed1-22bef38de348-kube-api-access-6vqs4\") pod \"must-gather-6wpzm\" (UID: \"295226a5-0fdf-44b2-aed1-22bef38de348\") " pod="openshift-must-gather-b8bgt/must-gather-6wpzm" Feb 18 06:50:24 crc kubenswrapper[4869]: I0218 06:50:24.785777 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-b8bgt"/"default-dockercfg-7b7t7" Feb 18 06:50:24 crc kubenswrapper[4869]: I0218 06:50:24.794053 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b8bgt/must-gather-6wpzm" Feb 18 06:50:25 crc kubenswrapper[4869]: I0218 06:50:25.263458 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b8bgt/must-gather-6wpzm"] Feb 18 06:50:25 crc kubenswrapper[4869]: I0218 06:50:25.712696 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8bgt/must-gather-6wpzm" event={"ID":"295226a5-0fdf-44b2-aed1-22bef38de348","Type":"ContainerStarted","Data":"d007828f6f0162f1e15091dda765fc1f941c48f2cad4ea823c138de403dff0f8"} Feb 18 06:50:25 crc kubenswrapper[4869]: I0218 06:50:25.713253 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8bgt/must-gather-6wpzm" event={"ID":"295226a5-0fdf-44b2-aed1-22bef38de348","Type":"ContainerStarted","Data":"2cf96292f45f373fa98edf2f44ad0e2d32de6404cbc45119ac5e989c1747023a"} Feb 18 06:50:26 crc kubenswrapper[4869]: I0218 06:50:26.001802 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mbkq5" Feb 18 06:50:26 crc kubenswrapper[4869]: I0218 06:50:26.002289 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mbkq5" Feb 18 06:50:26 crc kubenswrapper[4869]: I0218 06:50:26.049344 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mbkq5" Feb 18 06:50:26 crc kubenswrapper[4869]: I0218 06:50:26.725816 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8bgt/must-gather-6wpzm" event={"ID":"295226a5-0fdf-44b2-aed1-22bef38de348","Type":"ContainerStarted","Data":"4555bcbb65e3e36ef73a490d07e6d5f26b1c0e9e501164cf7932fc2bf03e7a1b"} Feb 18 06:50:26 crc kubenswrapper[4869]: I0218 06:50:26.790675 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mbkq5" Feb 18 06:50:26 crc 
kubenswrapper[4869]: I0218 06:50:26.817178 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b8bgt/must-gather-6wpzm" podStartSLOduration=2.817158744 podStartE2EDuration="2.817158744s" podCreationTimestamp="2026-02-18 06:50:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:50:26.746893721 +0000 UTC m=+3723.915981953" watchObservedRunningTime="2026-02-18 06:50:26.817158744 +0000 UTC m=+3723.986246976" Feb 18 06:50:26 crc kubenswrapper[4869]: I0218 06:50:26.840218 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mbkq5"] Feb 18 06:50:28 crc kubenswrapper[4869]: I0218 06:50:28.744138 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mbkq5" podUID="5a002190-f78f-4b2f-8f59-ed69e8a623ba" containerName="registry-server" containerID="cri-o://c5e7d4aa4c4a60751db9440e079e3b7f0ec7adc777f144e078a9b8250edfeb63" gracePeriod=2 Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.166978 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b8bgt/crc-debug-f8gcr"] Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.168578 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b8bgt/crc-debug-f8gcr" Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.200098 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mbkq5" Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.263493 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a002190-f78f-4b2f-8f59-ed69e8a623ba-utilities\") pod \"5a002190-f78f-4b2f-8f59-ed69e8a623ba\" (UID: \"5a002190-f78f-4b2f-8f59-ed69e8a623ba\") " Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.263869 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hxs2\" (UniqueName: \"kubernetes.io/projected/5a002190-f78f-4b2f-8f59-ed69e8a623ba-kube-api-access-5hxs2\") pod \"5a002190-f78f-4b2f-8f59-ed69e8a623ba\" (UID: \"5a002190-f78f-4b2f-8f59-ed69e8a623ba\") " Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.264160 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a002190-f78f-4b2f-8f59-ed69e8a623ba-catalog-content\") pod \"5a002190-f78f-4b2f-8f59-ed69e8a623ba\" (UID: \"5a002190-f78f-4b2f-8f59-ed69e8a623ba\") " Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.264584 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a002190-f78f-4b2f-8f59-ed69e8a623ba-utilities" (OuterVolumeSpecName: "utilities") pod "5a002190-f78f-4b2f-8f59-ed69e8a623ba" (UID: "5a002190-f78f-4b2f-8f59-ed69e8a623ba"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.264770 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6269ed3b-e273-4176-9b0b-27f4cfa62be6-host\") pod \"crc-debug-f8gcr\" (UID: \"6269ed3b-e273-4176-9b0b-27f4cfa62be6\") " pod="openshift-must-gather-b8bgt/crc-debug-f8gcr" Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.265006 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpn2x\" (UniqueName: \"kubernetes.io/projected/6269ed3b-e273-4176-9b0b-27f4cfa62be6-kube-api-access-fpn2x\") pod \"crc-debug-f8gcr\" (UID: \"6269ed3b-e273-4176-9b0b-27f4cfa62be6\") " pod="openshift-must-gather-b8bgt/crc-debug-f8gcr" Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.265163 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a002190-f78f-4b2f-8f59-ed69e8a623ba-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.272277 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a002190-f78f-4b2f-8f59-ed69e8a623ba-kube-api-access-5hxs2" (OuterVolumeSpecName: "kube-api-access-5hxs2") pod "5a002190-f78f-4b2f-8f59-ed69e8a623ba" (UID: "5a002190-f78f-4b2f-8f59-ed69e8a623ba"). InnerVolumeSpecName "kube-api-access-5hxs2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.331827 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a002190-f78f-4b2f-8f59-ed69e8a623ba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a002190-f78f-4b2f-8f59-ed69e8a623ba" (UID: "5a002190-f78f-4b2f-8f59-ed69e8a623ba"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.367157 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6269ed3b-e273-4176-9b0b-27f4cfa62be6-host\") pod \"crc-debug-f8gcr\" (UID: \"6269ed3b-e273-4176-9b0b-27f4cfa62be6\") " pod="openshift-must-gather-b8bgt/crc-debug-f8gcr" Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.367272 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpn2x\" (UniqueName: \"kubernetes.io/projected/6269ed3b-e273-4176-9b0b-27f4cfa62be6-kube-api-access-fpn2x\") pod \"crc-debug-f8gcr\" (UID: \"6269ed3b-e273-4176-9b0b-27f4cfa62be6\") " pod="openshift-must-gather-b8bgt/crc-debug-f8gcr" Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.367281 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6269ed3b-e273-4176-9b0b-27f4cfa62be6-host\") pod \"crc-debug-f8gcr\" (UID: \"6269ed3b-e273-4176-9b0b-27f4cfa62be6\") " pod="openshift-must-gather-b8bgt/crc-debug-f8gcr" Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.367365 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a002190-f78f-4b2f-8f59-ed69e8a623ba-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.367378 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hxs2\" (UniqueName: \"kubernetes.io/projected/5a002190-f78f-4b2f-8f59-ed69e8a623ba-kube-api-access-5hxs2\") on node \"crc\" DevicePath \"\"" Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.384084 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpn2x\" (UniqueName: \"kubernetes.io/projected/6269ed3b-e273-4176-9b0b-27f4cfa62be6-kube-api-access-fpn2x\") pod 
\"crc-debug-f8gcr\" (UID: \"6269ed3b-e273-4176-9b0b-27f4cfa62be6\") " pod="openshift-must-gather-b8bgt/crc-debug-f8gcr" Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.510415 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b8bgt/crc-debug-f8gcr" Feb 18 06:50:29 crc kubenswrapper[4869]: W0218 06:50:29.539343 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6269ed3b_e273_4176_9b0b_27f4cfa62be6.slice/crio-f7332a94d44c07dcc32eaf8386be87d5adeffa49ea289613f45018eead6bc375 WatchSource:0}: Error finding container f7332a94d44c07dcc32eaf8386be87d5adeffa49ea289613f45018eead6bc375: Status 404 returned error can't find the container with id f7332a94d44c07dcc32eaf8386be87d5adeffa49ea289613f45018eead6bc375 Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.756082 4869 generic.go:334] "Generic (PLEG): container finished" podID="5a002190-f78f-4b2f-8f59-ed69e8a623ba" containerID="c5e7d4aa4c4a60751db9440e079e3b7f0ec7adc777f144e078a9b8250edfeb63" exitCode=0 Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.756141 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mbkq5" Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.756157 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbkq5" event={"ID":"5a002190-f78f-4b2f-8f59-ed69e8a623ba","Type":"ContainerDied","Data":"c5e7d4aa4c4a60751db9440e079e3b7f0ec7adc777f144e078a9b8250edfeb63"} Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.756544 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbkq5" event={"ID":"5a002190-f78f-4b2f-8f59-ed69e8a623ba","Type":"ContainerDied","Data":"adaa77a1fba0b5a3b7ec1f3256414d38a8414dc5b5cb4498f647b6ecfcb67d4a"} Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.756565 4869 scope.go:117] "RemoveContainer" containerID="c5e7d4aa4c4a60751db9440e079e3b7f0ec7adc777f144e078a9b8250edfeb63" Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.757453 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8bgt/crc-debug-f8gcr" event={"ID":"6269ed3b-e273-4176-9b0b-27f4cfa62be6","Type":"ContainerStarted","Data":"f7332a94d44c07dcc32eaf8386be87d5adeffa49ea289613f45018eead6bc375"} Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.788811 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mbkq5"] Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.790327 4869 scope.go:117] "RemoveContainer" containerID="0661add15ee3454daf74cfbbc818ba00743dc68afa6ee840e39191b00c73ac18" Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.796098 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mbkq5"] Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.812222 4869 scope.go:117] "RemoveContainer" containerID="fec4612b7b5d891b4ced38d208ccdfd7ed8d97c4631caeaa80d4a428884bf3b0" Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.832147 4869 
scope.go:117] "RemoveContainer" containerID="c5e7d4aa4c4a60751db9440e079e3b7f0ec7adc777f144e078a9b8250edfeb63" Feb 18 06:50:29 crc kubenswrapper[4869]: E0218 06:50:29.832656 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5e7d4aa4c4a60751db9440e079e3b7f0ec7adc777f144e078a9b8250edfeb63\": container with ID starting with c5e7d4aa4c4a60751db9440e079e3b7f0ec7adc777f144e078a9b8250edfeb63 not found: ID does not exist" containerID="c5e7d4aa4c4a60751db9440e079e3b7f0ec7adc777f144e078a9b8250edfeb63" Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.832699 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5e7d4aa4c4a60751db9440e079e3b7f0ec7adc777f144e078a9b8250edfeb63"} err="failed to get container status \"c5e7d4aa4c4a60751db9440e079e3b7f0ec7adc777f144e078a9b8250edfeb63\": rpc error: code = NotFound desc = could not find container \"c5e7d4aa4c4a60751db9440e079e3b7f0ec7adc777f144e078a9b8250edfeb63\": container with ID starting with c5e7d4aa4c4a60751db9440e079e3b7f0ec7adc777f144e078a9b8250edfeb63 not found: ID does not exist" Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.832725 4869 scope.go:117] "RemoveContainer" containerID="0661add15ee3454daf74cfbbc818ba00743dc68afa6ee840e39191b00c73ac18" Feb 18 06:50:29 crc kubenswrapper[4869]: E0218 06:50:29.833192 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0661add15ee3454daf74cfbbc818ba00743dc68afa6ee840e39191b00c73ac18\": container with ID starting with 0661add15ee3454daf74cfbbc818ba00743dc68afa6ee840e39191b00c73ac18 not found: ID does not exist" containerID="0661add15ee3454daf74cfbbc818ba00743dc68afa6ee840e39191b00c73ac18" Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.833230 4869 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0661add15ee3454daf74cfbbc818ba00743dc68afa6ee840e39191b00c73ac18"} err="failed to get container status \"0661add15ee3454daf74cfbbc818ba00743dc68afa6ee840e39191b00c73ac18\": rpc error: code = NotFound desc = could not find container \"0661add15ee3454daf74cfbbc818ba00743dc68afa6ee840e39191b00c73ac18\": container with ID starting with 0661add15ee3454daf74cfbbc818ba00743dc68afa6ee840e39191b00c73ac18 not found: ID does not exist" Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.833272 4869 scope.go:117] "RemoveContainer" containerID="fec4612b7b5d891b4ced38d208ccdfd7ed8d97c4631caeaa80d4a428884bf3b0" Feb 18 06:50:29 crc kubenswrapper[4869]: E0218 06:50:29.833517 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fec4612b7b5d891b4ced38d208ccdfd7ed8d97c4631caeaa80d4a428884bf3b0\": container with ID starting with fec4612b7b5d891b4ced38d208ccdfd7ed8d97c4631caeaa80d4a428884bf3b0 not found: ID does not exist" containerID="fec4612b7b5d891b4ced38d208ccdfd7ed8d97c4631caeaa80d4a428884bf3b0" Feb 18 06:50:29 crc kubenswrapper[4869]: I0218 06:50:29.833542 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fec4612b7b5d891b4ced38d208ccdfd7ed8d97c4631caeaa80d4a428884bf3b0"} err="failed to get container status \"fec4612b7b5d891b4ced38d208ccdfd7ed8d97c4631caeaa80d4a428884bf3b0\": rpc error: code = NotFound desc = could not find container \"fec4612b7b5d891b4ced38d208ccdfd7ed8d97c4631caeaa80d4a428884bf3b0\": container with ID starting with fec4612b7b5d891b4ced38d208ccdfd7ed8d97c4631caeaa80d4a428884bf3b0 not found: ID does not exist" Feb 18 06:50:30 crc kubenswrapper[4869]: I0218 06:50:30.768320 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8bgt/crc-debug-f8gcr" 
event={"ID":"6269ed3b-e273-4176-9b0b-27f4cfa62be6","Type":"ContainerStarted","Data":"e0555d1f98cacd6af6a1ac2cc5e1b09722570eb441c4e788d17076f78c70ee3e"} Feb 18 06:50:30 crc kubenswrapper[4869]: I0218 06:50:30.782797 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b8bgt/crc-debug-f8gcr" podStartSLOduration=1.7827778909999998 podStartE2EDuration="1.782777891s" podCreationTimestamp="2026-02-18 06:50:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:50:30.780561687 +0000 UTC m=+3727.949649909" watchObservedRunningTime="2026-02-18 06:50:30.782777891 +0000 UTC m=+3727.951866143" Feb 18 06:50:31 crc kubenswrapper[4869]: I0218 06:50:31.470724 4869 scope.go:117] "RemoveContainer" containerID="cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae" Feb 18 06:50:31 crc kubenswrapper[4869]: E0218 06:50:31.470977 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:50:31 crc kubenswrapper[4869]: I0218 06:50:31.481052 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a002190-f78f-4b2f-8f59-ed69e8a623ba" path="/var/lib/kubelet/pods/5a002190-f78f-4b2f-8f59-ed69e8a623ba/volumes" Feb 18 06:50:31 crc kubenswrapper[4869]: I0218 06:50:31.715380 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5pl4w"] Feb 18 06:50:31 crc kubenswrapper[4869]: E0218 06:50:31.716160 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a002190-f78f-4b2f-8f59-ed69e8a623ba" 
containerName="registry-server" Feb 18 06:50:31 crc kubenswrapper[4869]: I0218 06:50:31.716180 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a002190-f78f-4b2f-8f59-ed69e8a623ba" containerName="registry-server" Feb 18 06:50:31 crc kubenswrapper[4869]: E0218 06:50:31.716202 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a002190-f78f-4b2f-8f59-ed69e8a623ba" containerName="extract-utilities" Feb 18 06:50:31 crc kubenswrapper[4869]: I0218 06:50:31.716210 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a002190-f78f-4b2f-8f59-ed69e8a623ba" containerName="extract-utilities" Feb 18 06:50:31 crc kubenswrapper[4869]: E0218 06:50:31.716228 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a002190-f78f-4b2f-8f59-ed69e8a623ba" containerName="extract-content" Feb 18 06:50:31 crc kubenswrapper[4869]: I0218 06:50:31.716236 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a002190-f78f-4b2f-8f59-ed69e8a623ba" containerName="extract-content" Feb 18 06:50:31 crc kubenswrapper[4869]: I0218 06:50:31.716429 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a002190-f78f-4b2f-8f59-ed69e8a623ba" containerName="registry-server" Feb 18 06:50:31 crc kubenswrapper[4869]: I0218 06:50:31.717823 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5pl4w" Feb 18 06:50:31 crc kubenswrapper[4869]: I0218 06:50:31.747708 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5pl4w"] Feb 18 06:50:31 crc kubenswrapper[4869]: I0218 06:50:31.812613 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c99477c0-53cd-40d7-9a1a-bbc64a0e084f-catalog-content\") pod \"certified-operators-5pl4w\" (UID: \"c99477c0-53cd-40d7-9a1a-bbc64a0e084f\") " pod="openshift-marketplace/certified-operators-5pl4w" Feb 18 06:50:31 crc kubenswrapper[4869]: I0218 06:50:31.813227 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c99477c0-53cd-40d7-9a1a-bbc64a0e084f-utilities\") pod \"certified-operators-5pl4w\" (UID: \"c99477c0-53cd-40d7-9a1a-bbc64a0e084f\") " pod="openshift-marketplace/certified-operators-5pl4w" Feb 18 06:50:31 crc kubenswrapper[4869]: I0218 06:50:31.813342 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc8lc\" (UniqueName: \"kubernetes.io/projected/c99477c0-53cd-40d7-9a1a-bbc64a0e084f-kube-api-access-jc8lc\") pod \"certified-operators-5pl4w\" (UID: \"c99477c0-53cd-40d7-9a1a-bbc64a0e084f\") " pod="openshift-marketplace/certified-operators-5pl4w" Feb 18 06:50:31 crc kubenswrapper[4869]: I0218 06:50:31.916539 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c99477c0-53cd-40d7-9a1a-bbc64a0e084f-catalog-content\") pod \"certified-operators-5pl4w\" (UID: \"c99477c0-53cd-40d7-9a1a-bbc64a0e084f\") " pod="openshift-marketplace/certified-operators-5pl4w" Feb 18 06:50:31 crc kubenswrapper[4869]: I0218 06:50:31.917095 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c99477c0-53cd-40d7-9a1a-bbc64a0e084f-utilities\") pod \"certified-operators-5pl4w\" (UID: \"c99477c0-53cd-40d7-9a1a-bbc64a0e084f\") " pod="openshift-marketplace/certified-operators-5pl4w" Feb 18 06:50:31 crc kubenswrapper[4869]: I0218 06:50:31.917127 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc8lc\" (UniqueName: \"kubernetes.io/projected/c99477c0-53cd-40d7-9a1a-bbc64a0e084f-kube-api-access-jc8lc\") pod \"certified-operators-5pl4w\" (UID: \"c99477c0-53cd-40d7-9a1a-bbc64a0e084f\") " pod="openshift-marketplace/certified-operators-5pl4w" Feb 18 06:50:31 crc kubenswrapper[4869]: I0218 06:50:31.917370 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c99477c0-53cd-40d7-9a1a-bbc64a0e084f-catalog-content\") pod \"certified-operators-5pl4w\" (UID: \"c99477c0-53cd-40d7-9a1a-bbc64a0e084f\") " pod="openshift-marketplace/certified-operators-5pl4w" Feb 18 06:50:31 crc kubenswrapper[4869]: I0218 06:50:31.917645 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c99477c0-53cd-40d7-9a1a-bbc64a0e084f-utilities\") pod \"certified-operators-5pl4w\" (UID: \"c99477c0-53cd-40d7-9a1a-bbc64a0e084f\") " pod="openshift-marketplace/certified-operators-5pl4w" Feb 18 06:50:31 crc kubenswrapper[4869]: I0218 06:50:31.946970 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc8lc\" (UniqueName: \"kubernetes.io/projected/c99477c0-53cd-40d7-9a1a-bbc64a0e084f-kube-api-access-jc8lc\") pod \"certified-operators-5pl4w\" (UID: \"c99477c0-53cd-40d7-9a1a-bbc64a0e084f\") " pod="openshift-marketplace/certified-operators-5pl4w" Feb 18 06:50:32 crc kubenswrapper[4869]: I0218 06:50:32.067366 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5pl4w" Feb 18 06:50:32 crc kubenswrapper[4869]: W0218 06:50:32.598127 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc99477c0_53cd_40d7_9a1a_bbc64a0e084f.slice/crio-8f09029bf1d4845c425cbdbd2b09fe2e0d1d0d14573db099b21996d67b81f535 WatchSource:0}: Error finding container 8f09029bf1d4845c425cbdbd2b09fe2e0d1d0d14573db099b21996d67b81f535: Status 404 returned error can't find the container with id 8f09029bf1d4845c425cbdbd2b09fe2e0d1d0d14573db099b21996d67b81f535 Feb 18 06:50:32 crc kubenswrapper[4869]: I0218 06:50:32.614118 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5pl4w"] Feb 18 06:50:32 crc kubenswrapper[4869]: I0218 06:50:32.803241 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5pl4w" event={"ID":"c99477c0-53cd-40d7-9a1a-bbc64a0e084f","Type":"ContainerStarted","Data":"0a080854ceeb55d55f3fe9554843436c205b00d0b74c5bc132858b458c61dd78"} Feb 18 06:50:32 crc kubenswrapper[4869]: I0218 06:50:32.803284 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5pl4w" event={"ID":"c99477c0-53cd-40d7-9a1a-bbc64a0e084f","Type":"ContainerStarted","Data":"8f09029bf1d4845c425cbdbd2b09fe2e0d1d0d14573db099b21996d67b81f535"} Feb 18 06:50:33 crc kubenswrapper[4869]: I0218 06:50:33.817059 4869 generic.go:334] "Generic (PLEG): container finished" podID="c99477c0-53cd-40d7-9a1a-bbc64a0e084f" containerID="0a080854ceeb55d55f3fe9554843436c205b00d0b74c5bc132858b458c61dd78" exitCode=0 Feb 18 06:50:33 crc kubenswrapper[4869]: I0218 06:50:33.817163 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5pl4w" 
event={"ID":"c99477c0-53cd-40d7-9a1a-bbc64a0e084f","Type":"ContainerDied","Data":"0a080854ceeb55d55f3fe9554843436c205b00d0b74c5bc132858b458c61dd78"} Feb 18 06:50:34 crc kubenswrapper[4869]: I0218 06:50:34.827339 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5pl4w" event={"ID":"c99477c0-53cd-40d7-9a1a-bbc64a0e084f","Type":"ContainerStarted","Data":"389f1109012b9c6bc61ac8ba8e6306a709c4b03a243d8922203f1231ded7df80"} Feb 18 06:50:36 crc kubenswrapper[4869]: I0218 06:50:36.845801 4869 generic.go:334] "Generic (PLEG): container finished" podID="c99477c0-53cd-40d7-9a1a-bbc64a0e084f" containerID="389f1109012b9c6bc61ac8ba8e6306a709c4b03a243d8922203f1231ded7df80" exitCode=0 Feb 18 06:50:36 crc kubenswrapper[4869]: I0218 06:50:36.845865 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5pl4w" event={"ID":"c99477c0-53cd-40d7-9a1a-bbc64a0e084f","Type":"ContainerDied","Data":"389f1109012b9c6bc61ac8ba8e6306a709c4b03a243d8922203f1231ded7df80"} Feb 18 06:50:38 crc kubenswrapper[4869]: I0218 06:50:38.893951 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5pl4w" event={"ID":"c99477c0-53cd-40d7-9a1a-bbc64a0e084f","Type":"ContainerStarted","Data":"719c10e0ab53c5cb54806fe65bd1bd8b04f933d6f7f8120e1ec0d18c3650fe6d"} Feb 18 06:50:38 crc kubenswrapper[4869]: I0218 06:50:38.923553 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5pl4w" podStartSLOduration=4.493605268 podStartE2EDuration="7.923530769s" podCreationTimestamp="2026-02-18 06:50:31 +0000 UTC" firstStartedPulling="2026-02-18 06:50:33.822460276 +0000 UTC m=+3730.991548508" lastFinishedPulling="2026-02-18 06:50:37.252385777 +0000 UTC m=+3734.421474009" observedRunningTime="2026-02-18 06:50:38.915815572 +0000 UTC m=+3736.084903804" watchObservedRunningTime="2026-02-18 06:50:38.923530769 +0000 UTC 
m=+3736.092619001" Feb 18 06:50:42 crc kubenswrapper[4869]: I0218 06:50:42.068305 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5pl4w" Feb 18 06:50:42 crc kubenswrapper[4869]: I0218 06:50:42.068766 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5pl4w" Feb 18 06:50:42 crc kubenswrapper[4869]: I0218 06:50:42.118382 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5pl4w" Feb 18 06:50:44 crc kubenswrapper[4869]: I0218 06:50:44.469804 4869 scope.go:117] "RemoveContainer" containerID="cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae" Feb 18 06:50:44 crc kubenswrapper[4869]: E0218 06:50:44.470372 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:50:52 crc kubenswrapper[4869]: I0218 06:50:52.120012 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5pl4w" Feb 18 06:50:52 crc kubenswrapper[4869]: I0218 06:50:52.170115 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5pl4w"] Feb 18 06:50:53 crc kubenswrapper[4869]: I0218 06:50:53.007373 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5pl4w" podUID="c99477c0-53cd-40d7-9a1a-bbc64a0e084f" containerName="registry-server" containerID="cri-o://719c10e0ab53c5cb54806fe65bd1bd8b04f933d6f7f8120e1ec0d18c3650fe6d" gracePeriod=2 Feb 
18 06:50:53 crc kubenswrapper[4869]: I0218 06:50:53.454368 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5pl4w" Feb 18 06:50:53 crc kubenswrapper[4869]: I0218 06:50:53.605277 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc8lc\" (UniqueName: \"kubernetes.io/projected/c99477c0-53cd-40d7-9a1a-bbc64a0e084f-kube-api-access-jc8lc\") pod \"c99477c0-53cd-40d7-9a1a-bbc64a0e084f\" (UID: \"c99477c0-53cd-40d7-9a1a-bbc64a0e084f\") " Feb 18 06:50:53 crc kubenswrapper[4869]: I0218 06:50:53.605436 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c99477c0-53cd-40d7-9a1a-bbc64a0e084f-catalog-content\") pod \"c99477c0-53cd-40d7-9a1a-bbc64a0e084f\" (UID: \"c99477c0-53cd-40d7-9a1a-bbc64a0e084f\") " Feb 18 06:50:53 crc kubenswrapper[4869]: I0218 06:50:53.605487 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c99477c0-53cd-40d7-9a1a-bbc64a0e084f-utilities\") pod \"c99477c0-53cd-40d7-9a1a-bbc64a0e084f\" (UID: \"c99477c0-53cd-40d7-9a1a-bbc64a0e084f\") " Feb 18 06:50:53 crc kubenswrapper[4869]: I0218 06:50:53.606801 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c99477c0-53cd-40d7-9a1a-bbc64a0e084f-utilities" (OuterVolumeSpecName: "utilities") pod "c99477c0-53cd-40d7-9a1a-bbc64a0e084f" (UID: "c99477c0-53cd-40d7-9a1a-bbc64a0e084f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:50:53 crc kubenswrapper[4869]: I0218 06:50:53.622617 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c99477c0-53cd-40d7-9a1a-bbc64a0e084f-kube-api-access-jc8lc" (OuterVolumeSpecName: "kube-api-access-jc8lc") pod "c99477c0-53cd-40d7-9a1a-bbc64a0e084f" (UID: "c99477c0-53cd-40d7-9a1a-bbc64a0e084f"). InnerVolumeSpecName "kube-api-access-jc8lc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:50:53 crc kubenswrapper[4869]: I0218 06:50:53.667329 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c99477c0-53cd-40d7-9a1a-bbc64a0e084f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c99477c0-53cd-40d7-9a1a-bbc64a0e084f" (UID: "c99477c0-53cd-40d7-9a1a-bbc64a0e084f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:50:53 crc kubenswrapper[4869]: I0218 06:50:53.707908 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc8lc\" (UniqueName: \"kubernetes.io/projected/c99477c0-53cd-40d7-9a1a-bbc64a0e084f-kube-api-access-jc8lc\") on node \"crc\" DevicePath \"\"" Feb 18 06:50:53 crc kubenswrapper[4869]: I0218 06:50:53.707949 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c99477c0-53cd-40d7-9a1a-bbc64a0e084f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:50:53 crc kubenswrapper[4869]: I0218 06:50:53.707961 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c99477c0-53cd-40d7-9a1a-bbc64a0e084f-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:50:54 crc kubenswrapper[4869]: I0218 06:50:54.024394 4869 generic.go:334] "Generic (PLEG): container finished" podID="c99477c0-53cd-40d7-9a1a-bbc64a0e084f" 
containerID="719c10e0ab53c5cb54806fe65bd1bd8b04f933d6f7f8120e1ec0d18c3650fe6d" exitCode=0 Feb 18 06:50:54 crc kubenswrapper[4869]: I0218 06:50:54.024435 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5pl4w" event={"ID":"c99477c0-53cd-40d7-9a1a-bbc64a0e084f","Type":"ContainerDied","Data":"719c10e0ab53c5cb54806fe65bd1bd8b04f933d6f7f8120e1ec0d18c3650fe6d"} Feb 18 06:50:54 crc kubenswrapper[4869]: I0218 06:50:54.024462 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5pl4w" event={"ID":"c99477c0-53cd-40d7-9a1a-bbc64a0e084f","Type":"ContainerDied","Data":"8f09029bf1d4845c425cbdbd2b09fe2e0d1d0d14573db099b21996d67b81f535"} Feb 18 06:50:54 crc kubenswrapper[4869]: I0218 06:50:54.024612 4869 scope.go:117] "RemoveContainer" containerID="719c10e0ab53c5cb54806fe65bd1bd8b04f933d6f7f8120e1ec0d18c3650fe6d" Feb 18 06:50:54 crc kubenswrapper[4869]: I0218 06:50:54.024693 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5pl4w" Feb 18 06:50:54 crc kubenswrapper[4869]: I0218 06:50:54.054639 4869 scope.go:117] "RemoveContainer" containerID="389f1109012b9c6bc61ac8ba8e6306a709c4b03a243d8922203f1231ded7df80" Feb 18 06:50:54 crc kubenswrapper[4869]: I0218 06:50:54.062215 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5pl4w"] Feb 18 06:50:54 crc kubenswrapper[4869]: I0218 06:50:54.076025 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5pl4w"] Feb 18 06:50:54 crc kubenswrapper[4869]: I0218 06:50:54.093660 4869 scope.go:117] "RemoveContainer" containerID="0a080854ceeb55d55f3fe9554843436c205b00d0b74c5bc132858b458c61dd78" Feb 18 06:50:54 crc kubenswrapper[4869]: I0218 06:50:54.130515 4869 scope.go:117] "RemoveContainer" containerID="719c10e0ab53c5cb54806fe65bd1bd8b04f933d6f7f8120e1ec0d18c3650fe6d" Feb 18 06:50:54 crc kubenswrapper[4869]: E0218 06:50:54.131008 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"719c10e0ab53c5cb54806fe65bd1bd8b04f933d6f7f8120e1ec0d18c3650fe6d\": container with ID starting with 719c10e0ab53c5cb54806fe65bd1bd8b04f933d6f7f8120e1ec0d18c3650fe6d not found: ID does not exist" containerID="719c10e0ab53c5cb54806fe65bd1bd8b04f933d6f7f8120e1ec0d18c3650fe6d" Feb 18 06:50:54 crc kubenswrapper[4869]: I0218 06:50:54.131046 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"719c10e0ab53c5cb54806fe65bd1bd8b04f933d6f7f8120e1ec0d18c3650fe6d"} err="failed to get container status \"719c10e0ab53c5cb54806fe65bd1bd8b04f933d6f7f8120e1ec0d18c3650fe6d\": rpc error: code = NotFound desc = could not find container \"719c10e0ab53c5cb54806fe65bd1bd8b04f933d6f7f8120e1ec0d18c3650fe6d\": container with ID starting with 719c10e0ab53c5cb54806fe65bd1bd8b04f933d6f7f8120e1ec0d18c3650fe6d not 
found: ID does not exist" Feb 18 06:50:54 crc kubenswrapper[4869]: I0218 06:50:54.131076 4869 scope.go:117] "RemoveContainer" containerID="389f1109012b9c6bc61ac8ba8e6306a709c4b03a243d8922203f1231ded7df80" Feb 18 06:50:54 crc kubenswrapper[4869]: E0218 06:50:54.131384 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"389f1109012b9c6bc61ac8ba8e6306a709c4b03a243d8922203f1231ded7df80\": container with ID starting with 389f1109012b9c6bc61ac8ba8e6306a709c4b03a243d8922203f1231ded7df80 not found: ID does not exist" containerID="389f1109012b9c6bc61ac8ba8e6306a709c4b03a243d8922203f1231ded7df80" Feb 18 06:50:54 crc kubenswrapper[4869]: I0218 06:50:54.131410 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"389f1109012b9c6bc61ac8ba8e6306a709c4b03a243d8922203f1231ded7df80"} err="failed to get container status \"389f1109012b9c6bc61ac8ba8e6306a709c4b03a243d8922203f1231ded7df80\": rpc error: code = NotFound desc = could not find container \"389f1109012b9c6bc61ac8ba8e6306a709c4b03a243d8922203f1231ded7df80\": container with ID starting with 389f1109012b9c6bc61ac8ba8e6306a709c4b03a243d8922203f1231ded7df80 not found: ID does not exist" Feb 18 06:50:54 crc kubenswrapper[4869]: I0218 06:50:54.131429 4869 scope.go:117] "RemoveContainer" containerID="0a080854ceeb55d55f3fe9554843436c205b00d0b74c5bc132858b458c61dd78" Feb 18 06:50:54 crc kubenswrapper[4869]: E0218 06:50:54.131676 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a080854ceeb55d55f3fe9554843436c205b00d0b74c5bc132858b458c61dd78\": container with ID starting with 0a080854ceeb55d55f3fe9554843436c205b00d0b74c5bc132858b458c61dd78 not found: ID does not exist" containerID="0a080854ceeb55d55f3fe9554843436c205b00d0b74c5bc132858b458c61dd78" Feb 18 06:50:54 crc kubenswrapper[4869]: I0218 06:50:54.131703 4869 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a080854ceeb55d55f3fe9554843436c205b00d0b74c5bc132858b458c61dd78"} err="failed to get container status \"0a080854ceeb55d55f3fe9554843436c205b00d0b74c5bc132858b458c61dd78\": rpc error: code = NotFound desc = could not find container \"0a080854ceeb55d55f3fe9554843436c205b00d0b74c5bc132858b458c61dd78\": container with ID starting with 0a080854ceeb55d55f3fe9554843436c205b00d0b74c5bc132858b458c61dd78 not found: ID does not exist" Feb 18 06:50:55 crc kubenswrapper[4869]: I0218 06:50:55.470423 4869 scope.go:117] "RemoveContainer" containerID="cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae" Feb 18 06:50:55 crc kubenswrapper[4869]: E0218 06:50:55.470805 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:50:55 crc kubenswrapper[4869]: I0218 06:50:55.483376 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c99477c0-53cd-40d7-9a1a-bbc64a0e084f" path="/var/lib/kubelet/pods/c99477c0-53cd-40d7-9a1a-bbc64a0e084f/volumes" Feb 18 06:51:03 crc kubenswrapper[4869]: I0218 06:51:03.100034 4869 generic.go:334] "Generic (PLEG): container finished" podID="6269ed3b-e273-4176-9b0b-27f4cfa62be6" containerID="e0555d1f98cacd6af6a1ac2cc5e1b09722570eb441c4e788d17076f78c70ee3e" exitCode=0 Feb 18 06:51:03 crc kubenswrapper[4869]: I0218 06:51:03.100244 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8bgt/crc-debug-f8gcr" 
event={"ID":"6269ed3b-e273-4176-9b0b-27f4cfa62be6","Type":"ContainerDied","Data":"e0555d1f98cacd6af6a1ac2cc5e1b09722570eb441c4e788d17076f78c70ee3e"} Feb 18 06:51:04 crc kubenswrapper[4869]: I0218 06:51:04.215639 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b8bgt/crc-debug-f8gcr" Feb 18 06:51:04 crc kubenswrapper[4869]: I0218 06:51:04.245773 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b8bgt/crc-debug-f8gcr"] Feb 18 06:51:04 crc kubenswrapper[4869]: I0218 06:51:04.252914 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b8bgt/crc-debug-f8gcr"] Feb 18 06:51:04 crc kubenswrapper[4869]: I0218 06:51:04.407047 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6269ed3b-e273-4176-9b0b-27f4cfa62be6-host\") pod \"6269ed3b-e273-4176-9b0b-27f4cfa62be6\" (UID: \"6269ed3b-e273-4176-9b0b-27f4cfa62be6\") " Feb 18 06:51:04 crc kubenswrapper[4869]: I0218 06:51:04.407258 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6269ed3b-e273-4176-9b0b-27f4cfa62be6-host" (OuterVolumeSpecName: "host") pod "6269ed3b-e273-4176-9b0b-27f4cfa62be6" (UID: "6269ed3b-e273-4176-9b0b-27f4cfa62be6"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:51:04 crc kubenswrapper[4869]: I0218 06:51:04.407302 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpn2x\" (UniqueName: \"kubernetes.io/projected/6269ed3b-e273-4176-9b0b-27f4cfa62be6-kube-api-access-fpn2x\") pod \"6269ed3b-e273-4176-9b0b-27f4cfa62be6\" (UID: \"6269ed3b-e273-4176-9b0b-27f4cfa62be6\") " Feb 18 06:51:04 crc kubenswrapper[4869]: I0218 06:51:04.407713 4869 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6269ed3b-e273-4176-9b0b-27f4cfa62be6-host\") on node \"crc\" DevicePath \"\"" Feb 18 06:51:04 crc kubenswrapper[4869]: I0218 06:51:04.413104 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6269ed3b-e273-4176-9b0b-27f4cfa62be6-kube-api-access-fpn2x" (OuterVolumeSpecName: "kube-api-access-fpn2x") pod "6269ed3b-e273-4176-9b0b-27f4cfa62be6" (UID: "6269ed3b-e273-4176-9b0b-27f4cfa62be6"). InnerVolumeSpecName "kube-api-access-fpn2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:51:04 crc kubenswrapper[4869]: I0218 06:51:04.509773 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpn2x\" (UniqueName: \"kubernetes.io/projected/6269ed3b-e273-4176-9b0b-27f4cfa62be6-kube-api-access-fpn2x\") on node \"crc\" DevicePath \"\"" Feb 18 06:51:05 crc kubenswrapper[4869]: I0218 06:51:05.120103 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7332a94d44c07dcc32eaf8386be87d5adeffa49ea289613f45018eead6bc375" Feb 18 06:51:05 crc kubenswrapper[4869]: I0218 06:51:05.120208 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b8bgt/crc-debug-f8gcr" Feb 18 06:51:05 crc kubenswrapper[4869]: I0218 06:51:05.438528 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b8bgt/crc-debug-c6xdv"] Feb 18 06:51:05 crc kubenswrapper[4869]: E0218 06:51:05.439300 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6269ed3b-e273-4176-9b0b-27f4cfa62be6" containerName="container-00" Feb 18 06:51:05 crc kubenswrapper[4869]: I0218 06:51:05.439314 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="6269ed3b-e273-4176-9b0b-27f4cfa62be6" containerName="container-00" Feb 18 06:51:05 crc kubenswrapper[4869]: E0218 06:51:05.439325 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c99477c0-53cd-40d7-9a1a-bbc64a0e084f" containerName="extract-utilities" Feb 18 06:51:05 crc kubenswrapper[4869]: I0218 06:51:05.439332 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="c99477c0-53cd-40d7-9a1a-bbc64a0e084f" containerName="extract-utilities" Feb 18 06:51:05 crc kubenswrapper[4869]: E0218 06:51:05.439344 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c99477c0-53cd-40d7-9a1a-bbc64a0e084f" containerName="extract-content" Feb 18 06:51:05 crc kubenswrapper[4869]: I0218 06:51:05.439350 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="c99477c0-53cd-40d7-9a1a-bbc64a0e084f" containerName="extract-content" Feb 18 06:51:05 crc kubenswrapper[4869]: E0218 06:51:05.439368 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c99477c0-53cd-40d7-9a1a-bbc64a0e084f" containerName="registry-server" Feb 18 06:51:05 crc kubenswrapper[4869]: I0218 06:51:05.439374 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="c99477c0-53cd-40d7-9a1a-bbc64a0e084f" containerName="registry-server" Feb 18 06:51:05 crc kubenswrapper[4869]: I0218 06:51:05.439556 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="6269ed3b-e273-4176-9b0b-27f4cfa62be6" 
containerName="container-00" Feb 18 06:51:05 crc kubenswrapper[4869]: I0218 06:51:05.439579 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="c99477c0-53cd-40d7-9a1a-bbc64a0e084f" containerName="registry-server" Feb 18 06:51:05 crc kubenswrapper[4869]: I0218 06:51:05.440397 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b8bgt/crc-debug-c6xdv" Feb 18 06:51:05 crc kubenswrapper[4869]: I0218 06:51:05.479371 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6269ed3b-e273-4176-9b0b-27f4cfa62be6" path="/var/lib/kubelet/pods/6269ed3b-e273-4176-9b0b-27f4cfa62be6/volumes" Feb 18 06:51:05 crc kubenswrapper[4869]: I0218 06:51:05.532595 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4l4s\" (UniqueName: \"kubernetes.io/projected/c4487252-51fe-40a9-aaaa-31f9378667fc-kube-api-access-b4l4s\") pod \"crc-debug-c6xdv\" (UID: \"c4487252-51fe-40a9-aaaa-31f9378667fc\") " pod="openshift-must-gather-b8bgt/crc-debug-c6xdv" Feb 18 06:51:05 crc kubenswrapper[4869]: I0218 06:51:05.532705 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4487252-51fe-40a9-aaaa-31f9378667fc-host\") pod \"crc-debug-c6xdv\" (UID: \"c4487252-51fe-40a9-aaaa-31f9378667fc\") " pod="openshift-must-gather-b8bgt/crc-debug-c6xdv" Feb 18 06:51:05 crc kubenswrapper[4869]: I0218 06:51:05.635097 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4l4s\" (UniqueName: \"kubernetes.io/projected/c4487252-51fe-40a9-aaaa-31f9378667fc-kube-api-access-b4l4s\") pod \"crc-debug-c6xdv\" (UID: \"c4487252-51fe-40a9-aaaa-31f9378667fc\") " pod="openshift-must-gather-b8bgt/crc-debug-c6xdv" Feb 18 06:51:05 crc kubenswrapper[4869]: I0218 06:51:05.635280 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4487252-51fe-40a9-aaaa-31f9378667fc-host\") pod \"crc-debug-c6xdv\" (UID: \"c4487252-51fe-40a9-aaaa-31f9378667fc\") " pod="openshift-must-gather-b8bgt/crc-debug-c6xdv" Feb 18 06:51:05 crc kubenswrapper[4869]: I0218 06:51:05.635465 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4487252-51fe-40a9-aaaa-31f9378667fc-host\") pod \"crc-debug-c6xdv\" (UID: \"c4487252-51fe-40a9-aaaa-31f9378667fc\") " pod="openshift-must-gather-b8bgt/crc-debug-c6xdv" Feb 18 06:51:05 crc kubenswrapper[4869]: I0218 06:51:05.654667 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4l4s\" (UniqueName: \"kubernetes.io/projected/c4487252-51fe-40a9-aaaa-31f9378667fc-kube-api-access-b4l4s\") pod \"crc-debug-c6xdv\" (UID: \"c4487252-51fe-40a9-aaaa-31f9378667fc\") " pod="openshift-must-gather-b8bgt/crc-debug-c6xdv" Feb 18 06:51:05 crc kubenswrapper[4869]: I0218 06:51:05.754711 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b8bgt/crc-debug-c6xdv" Feb 18 06:51:06 crc kubenswrapper[4869]: I0218 06:51:06.134871 4869 generic.go:334] "Generic (PLEG): container finished" podID="c4487252-51fe-40a9-aaaa-31f9378667fc" containerID="14ca9aa87a04a581cb32de0920b71ea24ceb35b0d965a533e537dadf8c45b81c" exitCode=0 Feb 18 06:51:06 crc kubenswrapper[4869]: I0218 06:51:06.135034 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8bgt/crc-debug-c6xdv" event={"ID":"c4487252-51fe-40a9-aaaa-31f9378667fc","Type":"ContainerDied","Data":"14ca9aa87a04a581cb32de0920b71ea24ceb35b0d965a533e537dadf8c45b81c"} Feb 18 06:51:06 crc kubenswrapper[4869]: I0218 06:51:06.135217 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8bgt/crc-debug-c6xdv" event={"ID":"c4487252-51fe-40a9-aaaa-31f9378667fc","Type":"ContainerStarted","Data":"cb9348e92f6ef045129414d573c211d6b3b03a0955f8fe00920f32bb4ee30f32"} Feb 18 06:51:06 crc kubenswrapper[4869]: I0218 06:51:06.524743 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b8bgt/crc-debug-c6xdv"] Feb 18 06:51:06 crc kubenswrapper[4869]: I0218 06:51:06.533514 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b8bgt/crc-debug-c6xdv"] Feb 18 06:51:07 crc kubenswrapper[4869]: I0218 06:51:07.246245 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b8bgt/crc-debug-c6xdv" Feb 18 06:51:07 crc kubenswrapper[4869]: I0218 06:51:07.361908 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4l4s\" (UniqueName: \"kubernetes.io/projected/c4487252-51fe-40a9-aaaa-31f9378667fc-kube-api-access-b4l4s\") pod \"c4487252-51fe-40a9-aaaa-31f9378667fc\" (UID: \"c4487252-51fe-40a9-aaaa-31f9378667fc\") " Feb 18 06:51:07 crc kubenswrapper[4869]: I0218 06:51:07.361966 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4487252-51fe-40a9-aaaa-31f9378667fc-host\") pod \"c4487252-51fe-40a9-aaaa-31f9378667fc\" (UID: \"c4487252-51fe-40a9-aaaa-31f9378667fc\") " Feb 18 06:51:07 crc kubenswrapper[4869]: I0218 06:51:07.362295 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4487252-51fe-40a9-aaaa-31f9378667fc-host" (OuterVolumeSpecName: "host") pod "c4487252-51fe-40a9-aaaa-31f9378667fc" (UID: "c4487252-51fe-40a9-aaaa-31f9378667fc"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:51:07 crc kubenswrapper[4869]: I0218 06:51:07.362669 4869 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4487252-51fe-40a9-aaaa-31f9378667fc-host\") on node \"crc\" DevicePath \"\"" Feb 18 06:51:07 crc kubenswrapper[4869]: I0218 06:51:07.373392 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4487252-51fe-40a9-aaaa-31f9378667fc-kube-api-access-b4l4s" (OuterVolumeSpecName: "kube-api-access-b4l4s") pod "c4487252-51fe-40a9-aaaa-31f9378667fc" (UID: "c4487252-51fe-40a9-aaaa-31f9378667fc"). InnerVolumeSpecName "kube-api-access-b4l4s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:51:07 crc kubenswrapper[4869]: I0218 06:51:07.464779 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4l4s\" (UniqueName: \"kubernetes.io/projected/c4487252-51fe-40a9-aaaa-31f9378667fc-kube-api-access-b4l4s\") on node \"crc\" DevicePath \"\"" Feb 18 06:51:07 crc kubenswrapper[4869]: I0218 06:51:07.479517 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4487252-51fe-40a9-aaaa-31f9378667fc" path="/var/lib/kubelet/pods/c4487252-51fe-40a9-aaaa-31f9378667fc/volumes" Feb 18 06:51:07 crc kubenswrapper[4869]: I0218 06:51:07.739682 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b8bgt/crc-debug-n2j6d"] Feb 18 06:51:07 crc kubenswrapper[4869]: E0218 06:51:07.740219 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4487252-51fe-40a9-aaaa-31f9378667fc" containerName="container-00" Feb 18 06:51:07 crc kubenswrapper[4869]: I0218 06:51:07.740235 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4487252-51fe-40a9-aaaa-31f9378667fc" containerName="container-00" Feb 18 06:51:07 crc kubenswrapper[4869]: I0218 06:51:07.740449 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4487252-51fe-40a9-aaaa-31f9378667fc" containerName="container-00" Feb 18 06:51:07 crc kubenswrapper[4869]: I0218 06:51:07.741173 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b8bgt/crc-debug-n2j6d" Feb 18 06:51:07 crc kubenswrapper[4869]: I0218 06:51:07.871721 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gxxv\" (UniqueName: \"kubernetes.io/projected/6d6bcd99-710f-4171-83bc-833811c2b97d-kube-api-access-9gxxv\") pod \"crc-debug-n2j6d\" (UID: \"6d6bcd99-710f-4171-83bc-833811c2b97d\") " pod="openshift-must-gather-b8bgt/crc-debug-n2j6d" Feb 18 06:51:07 crc kubenswrapper[4869]: I0218 06:51:07.871813 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d6bcd99-710f-4171-83bc-833811c2b97d-host\") pod \"crc-debug-n2j6d\" (UID: \"6d6bcd99-710f-4171-83bc-833811c2b97d\") " pod="openshift-must-gather-b8bgt/crc-debug-n2j6d" Feb 18 06:51:07 crc kubenswrapper[4869]: I0218 06:51:07.973818 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gxxv\" (UniqueName: \"kubernetes.io/projected/6d6bcd99-710f-4171-83bc-833811c2b97d-kube-api-access-9gxxv\") pod \"crc-debug-n2j6d\" (UID: \"6d6bcd99-710f-4171-83bc-833811c2b97d\") " pod="openshift-must-gather-b8bgt/crc-debug-n2j6d" Feb 18 06:51:07 crc kubenswrapper[4869]: I0218 06:51:07.973908 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d6bcd99-710f-4171-83bc-833811c2b97d-host\") pod \"crc-debug-n2j6d\" (UID: \"6d6bcd99-710f-4171-83bc-833811c2b97d\") " pod="openshift-must-gather-b8bgt/crc-debug-n2j6d" Feb 18 06:51:07 crc kubenswrapper[4869]: I0218 06:51:07.974021 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d6bcd99-710f-4171-83bc-833811c2b97d-host\") pod \"crc-debug-n2j6d\" (UID: \"6d6bcd99-710f-4171-83bc-833811c2b97d\") " pod="openshift-must-gather-b8bgt/crc-debug-n2j6d" Feb 18 06:51:07 crc 
kubenswrapper[4869]: I0218 06:51:07.990805 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gxxv\" (UniqueName: \"kubernetes.io/projected/6d6bcd99-710f-4171-83bc-833811c2b97d-kube-api-access-9gxxv\") pod \"crc-debug-n2j6d\" (UID: \"6d6bcd99-710f-4171-83bc-833811c2b97d\") " pod="openshift-must-gather-b8bgt/crc-debug-n2j6d" Feb 18 06:51:08 crc kubenswrapper[4869]: I0218 06:51:08.058222 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b8bgt/crc-debug-n2j6d" Feb 18 06:51:08 crc kubenswrapper[4869]: W0218 06:51:08.082669 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d6bcd99_710f_4171_83bc_833811c2b97d.slice/crio-dc394d5e175230dad3beb4123a7c813c08e70db6c4592bd99493c5424e745dec WatchSource:0}: Error finding container dc394d5e175230dad3beb4123a7c813c08e70db6c4592bd99493c5424e745dec: Status 404 returned error can't find the container with id dc394d5e175230dad3beb4123a7c813c08e70db6c4592bd99493c5424e745dec Feb 18 06:51:08 crc kubenswrapper[4869]: I0218 06:51:08.151909 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8bgt/crc-debug-n2j6d" event={"ID":"6d6bcd99-710f-4171-83bc-833811c2b97d","Type":"ContainerStarted","Data":"dc394d5e175230dad3beb4123a7c813c08e70db6c4592bd99493c5424e745dec"} Feb 18 06:51:08 crc kubenswrapper[4869]: I0218 06:51:08.153784 4869 scope.go:117] "RemoveContainer" containerID="14ca9aa87a04a581cb32de0920b71ea24ceb35b0d965a533e537dadf8c45b81c" Feb 18 06:51:08 crc kubenswrapper[4869]: I0218 06:51:08.153821 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b8bgt/crc-debug-c6xdv" Feb 18 06:51:08 crc kubenswrapper[4869]: I0218 06:51:08.470338 4869 scope.go:117] "RemoveContainer" containerID="cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae" Feb 18 06:51:08 crc kubenswrapper[4869]: E0218 06:51:08.470674 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:51:09 crc kubenswrapper[4869]: I0218 06:51:09.163503 4869 generic.go:334] "Generic (PLEG): container finished" podID="6d6bcd99-710f-4171-83bc-833811c2b97d" containerID="787bfd89920d8e35ec74e0a252d1227db2f849c22d24d19bae1e61bb1b3e84d1" exitCode=0 Feb 18 06:51:09 crc kubenswrapper[4869]: I0218 06:51:09.163605 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8bgt/crc-debug-n2j6d" event={"ID":"6d6bcd99-710f-4171-83bc-833811c2b97d","Type":"ContainerDied","Data":"787bfd89920d8e35ec74e0a252d1227db2f849c22d24d19bae1e61bb1b3e84d1"} Feb 18 06:51:09 crc kubenswrapper[4869]: I0218 06:51:09.208396 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b8bgt/crc-debug-n2j6d"] Feb 18 06:51:09 crc kubenswrapper[4869]: I0218 06:51:09.221991 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b8bgt/crc-debug-n2j6d"] Feb 18 06:51:10 crc kubenswrapper[4869]: I0218 06:51:10.270581 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b8bgt/crc-debug-n2j6d" Feb 18 06:51:10 crc kubenswrapper[4869]: I0218 06:51:10.415092 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gxxv\" (UniqueName: \"kubernetes.io/projected/6d6bcd99-710f-4171-83bc-833811c2b97d-kube-api-access-9gxxv\") pod \"6d6bcd99-710f-4171-83bc-833811c2b97d\" (UID: \"6d6bcd99-710f-4171-83bc-833811c2b97d\") " Feb 18 06:51:10 crc kubenswrapper[4869]: I0218 06:51:10.415236 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d6bcd99-710f-4171-83bc-833811c2b97d-host\") pod \"6d6bcd99-710f-4171-83bc-833811c2b97d\" (UID: \"6d6bcd99-710f-4171-83bc-833811c2b97d\") " Feb 18 06:51:10 crc kubenswrapper[4869]: I0218 06:51:10.415439 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d6bcd99-710f-4171-83bc-833811c2b97d-host" (OuterVolumeSpecName: "host") pod "6d6bcd99-710f-4171-83bc-833811c2b97d" (UID: "6d6bcd99-710f-4171-83bc-833811c2b97d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:51:10 crc kubenswrapper[4869]: I0218 06:51:10.416089 4869 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d6bcd99-710f-4171-83bc-833811c2b97d-host\") on node \"crc\" DevicePath \"\"" Feb 18 06:51:10 crc kubenswrapper[4869]: I0218 06:51:10.420603 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d6bcd99-710f-4171-83bc-833811c2b97d-kube-api-access-9gxxv" (OuterVolumeSpecName: "kube-api-access-9gxxv") pod "6d6bcd99-710f-4171-83bc-833811c2b97d" (UID: "6d6bcd99-710f-4171-83bc-833811c2b97d"). InnerVolumeSpecName "kube-api-access-9gxxv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:51:10 crc kubenswrapper[4869]: I0218 06:51:10.517608 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gxxv\" (UniqueName: \"kubernetes.io/projected/6d6bcd99-710f-4171-83bc-833811c2b97d-kube-api-access-9gxxv\") on node \"crc\" DevicePath \"\"" Feb 18 06:51:11 crc kubenswrapper[4869]: I0218 06:51:11.181289 4869 scope.go:117] "RemoveContainer" containerID="787bfd89920d8e35ec74e0a252d1227db2f849c22d24d19bae1e61bb1b3e84d1" Feb 18 06:51:11 crc kubenswrapper[4869]: I0218 06:51:11.181686 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b8bgt/crc-debug-n2j6d" Feb 18 06:51:11 crc kubenswrapper[4869]: I0218 06:51:11.479980 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d6bcd99-710f-4171-83bc-833811c2b97d" path="/var/lib/kubelet/pods/6d6bcd99-710f-4171-83bc-833811c2b97d/volumes" Feb 18 06:51:21 crc kubenswrapper[4869]: I0218 06:51:21.469962 4869 scope.go:117] "RemoveContainer" containerID="cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae" Feb 18 06:51:21 crc kubenswrapper[4869]: E0218 06:51:21.470666 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:51:34 crc kubenswrapper[4869]: I0218 06:51:34.469713 4869 scope.go:117] "RemoveContainer" containerID="cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae" Feb 18 06:51:34 crc kubenswrapper[4869]: E0218 06:51:34.471561 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:51:40 crc kubenswrapper[4869]: I0218 06:51:40.050062 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-85cc9b8698-mzcgf_00e28946-38e8-4b00-8181-d45908ad9863/barbican-api/0.log" Feb 18 06:51:40 crc kubenswrapper[4869]: I0218 06:51:40.143309 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-85cc9b8698-mzcgf_00e28946-38e8-4b00-8181-d45908ad9863/barbican-api-log/0.log" Feb 18 06:51:40 crc kubenswrapper[4869]: I0218 06:51:40.273906 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5884488646-54ch2_4ed72952-b72a-4f66-8e63-84d18936ff3a/barbican-keystone-listener/0.log" Feb 18 06:51:40 crc kubenswrapper[4869]: I0218 06:51:40.288233 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5884488646-54ch2_4ed72952-b72a-4f66-8e63-84d18936ff3a/barbican-keystone-listener-log/0.log" Feb 18 06:51:40 crc kubenswrapper[4869]: I0218 06:51:40.438144 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-98f68fcf5-pz8rv_c3b79627-ea10-4a59-a5ae-f24d3ace238d/barbican-worker-log/0.log" Feb 18 06:51:40 crc kubenswrapper[4869]: I0218 06:51:40.441657 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-98f68fcf5-pz8rv_c3b79627-ea10-4a59-a5ae-f24d3ace238d/barbican-worker/0.log" Feb 18 06:51:40 crc kubenswrapper[4869]: I0218 06:51:40.645677 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-hkxxw_26716094-10bf-4523-9c23-674dd4b7d517/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 06:51:40 crc kubenswrapper[4869]: I0218 06:51:40.648954 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e0226ffe-db7d-48b2-acee-8a9f7045c083/ceilometer-central-agent/0.log" Feb 18 06:51:40 crc kubenswrapper[4869]: I0218 06:51:40.716498 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e0226ffe-db7d-48b2-acee-8a9f7045c083/ceilometer-notification-agent/0.log" Feb 18 06:51:40 crc kubenswrapper[4869]: I0218 06:51:40.818599 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e0226ffe-db7d-48b2-acee-8a9f7045c083/proxy-httpd/0.log" Feb 18 06:51:40 crc kubenswrapper[4869]: I0218 06:51:40.857571 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e0226ffe-db7d-48b2-acee-8a9f7045c083/sg-core/0.log" Feb 18 06:51:40 crc kubenswrapper[4869]: I0218 06:51:40.978086 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e110f65d-fa29-4024-a0d8-352543bd0c1b/cinder-api/0.log" Feb 18 06:51:41 crc kubenswrapper[4869]: I0218 06:51:41.084474 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e110f65d-fa29-4024-a0d8-352543bd0c1b/cinder-api-log/0.log" Feb 18 06:51:41 crc kubenswrapper[4869]: I0218 06:51:41.128830 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_c911137c-7aa9-4875-ae81-e91caebd828a/cinder-scheduler/0.log" Feb 18 06:51:41 crc kubenswrapper[4869]: I0218 06:51:41.253524 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_c911137c-7aa9-4875-ae81-e91caebd828a/probe/0.log" Feb 18 06:51:41 crc kubenswrapper[4869]: I0218 06:51:41.374279 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-8zlzn_1475668a-1132-4548-a5e6-0f4a459480c1/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 06:51:41 crc kubenswrapper[4869]: I0218 06:51:41.466418 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-6dg7f_8ab4b789-eeaf-4e68-b947-436fc6f6bafa/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 06:51:41 crc kubenswrapper[4869]: I0218 06:51:41.580394 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-n44p5_57134f39-764c-4164-a5ab-9392660d554b/init/0.log"
Feb 18 06:51:41 crc kubenswrapper[4869]: I0218 06:51:41.776727 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-n44p5_57134f39-764c-4164-a5ab-9392660d554b/dnsmasq-dns/0.log"
Feb 18 06:51:41 crc kubenswrapper[4869]: I0218 06:51:41.782789 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-n44p5_57134f39-764c-4164-a5ab-9392660d554b/init/0.log"
Feb 18 06:51:41 crc kubenswrapper[4869]: I0218 06:51:41.810645 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-7q4cn_102527af-43b3-4260-bdbf-cd653b203986/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 06:51:42 crc kubenswrapper[4869]: I0218 06:51:42.141324 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_469d9354-4653-4a99-b457-3b453082e0e0/glance-httpd/0.log"
Feb 18 06:51:42 crc kubenswrapper[4869]: I0218 06:51:42.208442 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_469d9354-4653-4a99-b457-3b453082e0e0/glance-log/0.log"
Feb 18 06:51:42 crc kubenswrapper[4869]: I0218 06:51:42.339206 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_1dfb852b-63ab-46ce-8f5b-3c6be7b02400/glance-httpd/0.log"
Feb 18 06:51:42 crc kubenswrapper[4869]: I0218 06:51:42.370154 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_1dfb852b-63ab-46ce-8f5b-3c6be7b02400/glance-log/0.log"
Feb 18 06:51:42 crc kubenswrapper[4869]: I0218 06:51:42.502774 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-57f5fddd88-qhh5n_391d8fe4-58ea-434e-918f-811b7c3e14b2/horizon/0.log"
Feb 18 06:51:42 crc kubenswrapper[4869]: I0218 06:51:42.644211 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-bdr55_9e02b084-943e-4579-87f9-6a0cdff0d8c1/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 06:51:42 crc kubenswrapper[4869]: I0218 06:51:42.841720 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-cctjr_c1b93caa-11f6-4841-b63c-6542711f26cc/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 06:51:42 crc kubenswrapper[4869]: I0218 06:51:42.911115 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-57f5fddd88-qhh5n_391d8fe4-58ea-434e-918f-811b7c3e14b2/horizon-log/0.log"
Feb 18 06:51:43 crc kubenswrapper[4869]: I0218 06:51:43.132514 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_ecd7b730-0094-40d2-9894-d90d45f8c2de/kube-state-metrics/0.log"
Feb 18 06:51:43 crc kubenswrapper[4869]: I0218 06:51:43.147988 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-666896fcd4-c65vb_e56418e8-0afb-47a6-9064-ff0a381ef2ba/keystone-api/0.log"
Feb 18 06:51:43 crc kubenswrapper[4869]: I0218 06:51:43.406835 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-jrx4w_162af8f6-3123-4d8b-a602-0b2808cd6654/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 06:51:43 crc kubenswrapper[4869]: I0218 06:51:43.722293 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d4594595c-zvnfb_dffbc8b7-8080-4958-915d-ee66f5ae732b/neutron-api/0.log"
Feb 18 06:51:43 crc kubenswrapper[4869]: I0218 06:51:43.747300 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-78vdw_4c96cef9-45b8-4639-a368-063acac72c83/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 06:51:43 crc kubenswrapper[4869]: I0218 06:51:43.751957 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d4594595c-zvnfb_dffbc8b7-8080-4958-915d-ee66f5ae732b/neutron-httpd/0.log"
Feb 18 06:51:44 crc kubenswrapper[4869]: I0218 06:51:44.313182 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_1b848f50-bd92-4c86-8f5e-64c4fd3e2521/nova-api-log/0.log"
Feb 18 06:51:44 crc kubenswrapper[4869]: I0218 06:51:44.464718 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_eabb4471-00e8-4edb-9128-249b4057d5d7/nova-cell0-conductor-conductor/0.log"
Feb 18 06:51:44 crc kubenswrapper[4869]: I0218 06:51:44.634576 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_4662bd21-e3c5-4980-bac5-dc1f76c958c3/nova-cell1-conductor-conductor/0.log"
Feb 18 06:51:44 crc kubenswrapper[4869]: I0218 06:51:44.798344 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_1b848f50-bd92-4c86-8f5e-64c4fd3e2521/nova-api-api/0.log"
Feb 18 06:51:44 crc kubenswrapper[4869]: I0218 06:51:44.872522 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_50575a1a-5d98-4692-b1ca-d275e90c6fed/nova-cell1-novncproxy-novncproxy/0.log"
Feb 18 06:51:45 crc kubenswrapper[4869]: I0218 06:51:45.049470 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-8p8x4_7d30bdae-b0f6-49aa-b343-c2b9abc186ba/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 06:51:45 crc kubenswrapper[4869]: I0218 06:51:45.258764 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f/nova-metadata-log/0.log"
Feb 18 06:51:45 crc kubenswrapper[4869]: I0218 06:51:45.449049 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_23dc38a3-9ce0-4f1f-9495-2dc65f2474e5/mysql-bootstrap/0.log"
Feb 18 06:51:45 crc kubenswrapper[4869]: I0218 06:51:45.486717 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_f31fe572-a7a1-48f4-8ebb-e621788c2456/nova-scheduler-scheduler/0.log"
Feb 18 06:51:45 crc kubenswrapper[4869]: I0218 06:51:45.739175 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_23dc38a3-9ce0-4f1f-9495-2dc65f2474e5/mysql-bootstrap/0.log"
Feb 18 06:51:45 crc kubenswrapper[4869]: I0218 06:51:45.763453 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_23dc38a3-9ce0-4f1f-9495-2dc65f2474e5/galera/0.log"
Feb 18 06:51:45 crc kubenswrapper[4869]: I0218 06:51:45.955968 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_32064888-24ad-482d-ba16-36bfb48b069e/mysql-bootstrap/0.log"
Feb 18 06:51:46 crc kubenswrapper[4869]: I0218 06:51:46.083931 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_32064888-24ad-482d-ba16-36bfb48b069e/mysql-bootstrap/0.log"
Feb 18 06:51:46 crc kubenswrapper[4869]: I0218 06:51:46.155700 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_32064888-24ad-482d-ba16-36bfb48b069e/galera/0.log"
Feb 18 06:51:46 crc kubenswrapper[4869]: I0218 06:51:46.336850 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_348bdc65-0fd0-4870-adfc-d0d69a51e762/openstackclient/0.log"
Feb 18 06:51:46 crc kubenswrapper[4869]: I0218 06:51:46.403616 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_eec51a6e-eb15-4c31-bb5a-c3aa1eb81e5f/nova-metadata-metadata/0.log"
Feb 18 06:51:46 crc kubenswrapper[4869]: I0218 06:51:46.448562 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-6zzxt_0b85434e-56f8-4cab-91a5-8cf0ea0356fc/ovn-controller/0.log"
Feb 18 06:51:46 crc kubenswrapper[4869]: I0218 06:51:46.582429 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-47kkq_7cb275de-f7e9-434d-b934-37dfb39e92ac/openstack-network-exporter/0.log"
Feb 18 06:51:46 crc kubenswrapper[4869]: I0218 06:51:46.668710 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5czp2_8e4e9056-1f05-4fc5-b1a1-e578abbc24c6/ovsdb-server-init/0.log"
Feb 18 06:51:46 crc kubenswrapper[4869]: I0218 06:51:46.918697 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5czp2_8e4e9056-1f05-4fc5-b1a1-e578abbc24c6/ovsdb-server/0.log"
Feb 18 06:51:46 crc kubenswrapper[4869]: I0218 06:51:46.952998 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5czp2_8e4e9056-1f05-4fc5-b1a1-e578abbc24c6/ovsdb-server-init/0.log"
Feb 18 06:51:47 crc kubenswrapper[4869]: I0218 06:51:47.017181 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5czp2_8e4e9056-1f05-4fc5-b1a1-e578abbc24c6/ovs-vswitchd/0.log"
Feb 18 06:51:47 crc kubenswrapper[4869]: I0218 06:51:47.150856 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-kqrfk_b8020d5a-e997-4376-bef7-488e40f51277/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 06:51:47 crc kubenswrapper[4869]: I0218 06:51:47.158917 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_65ad1fc9-3393-4e57-9041-c17ef5279ddd/openstack-network-exporter/0.log"
Feb 18 06:51:47 crc kubenswrapper[4869]: I0218 06:51:47.259869 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_65ad1fc9-3393-4e57-9041-c17ef5279ddd/ovn-northd/0.log"
Feb 18 06:51:47 crc kubenswrapper[4869]: I0218 06:51:47.318692 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_25b0cd0b-8d96-4067-a1da-171e5f0b9545/openstack-network-exporter/0.log"
Feb 18 06:51:47 crc kubenswrapper[4869]: I0218 06:51:47.475766 4869 scope.go:117] "RemoveContainer" containerID="cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae"
Feb 18 06:51:47 crc kubenswrapper[4869]: E0218 06:51:47.476287 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d"
Feb 18 06:51:47 crc kubenswrapper[4869]: I0218 06:51:47.650161 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_25b0cd0b-8d96-4067-a1da-171e5f0b9545/ovsdbserver-nb/0.log"
Feb 18 06:51:47 crc kubenswrapper[4869]: I0218 06:51:47.686317 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0991303d-6180-44e1-9baa-88ece3cdbfaf/openstack-network-exporter/0.log"
Feb 18 06:51:47 crc kubenswrapper[4869]: I0218 06:51:47.716097 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0991303d-6180-44e1-9baa-88ece3cdbfaf/ovsdbserver-sb/0.log"
Feb 18 06:51:48 crc kubenswrapper[4869]: I0218 06:51:48.011858 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-55b8768b96-mwv6g_965b4f29-cb41-4066-a9e6-3729ec43b2bd/placement-api/0.log"
Feb 18 06:51:48 crc kubenswrapper[4869]: I0218 06:51:48.071145 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-55b8768b96-mwv6g_965b4f29-cb41-4066-a9e6-3729ec43b2bd/placement-log/0.log"
Feb 18 06:51:48 crc kubenswrapper[4869]: I0218 06:51:48.131721 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_eb17f4cc-a879-4fb2-be2e-4e0e47167746/setup-container/0.log"
Feb 18 06:51:48 crc kubenswrapper[4869]: I0218 06:51:48.385244 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_eb17f4cc-a879-4fb2-be2e-4e0e47167746/setup-container/0.log"
Feb 18 06:51:48 crc kubenswrapper[4869]: I0218 06:51:48.395500 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_eb17f4cc-a879-4fb2-be2e-4e0e47167746/rabbitmq/0.log"
Feb 18 06:51:48 crc kubenswrapper[4869]: I0218 06:51:48.434496 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_35973c92-2b94-4366-aa4b-637920311279/setup-container/0.log"
Feb 18 06:51:48 crc kubenswrapper[4869]: I0218 06:51:48.724249 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_35973c92-2b94-4366-aa4b-637920311279/setup-container/0.log"
Feb 18 06:51:48 crc kubenswrapper[4869]: I0218 06:51:48.767773 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-x9spz_4d6656f7-173a-4e9e-b802-6547876438ec/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 06:51:48 crc kubenswrapper[4869]: I0218 06:51:48.827920 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_35973c92-2b94-4366-aa4b-637920311279/rabbitmq/0.log"
Feb 18 06:51:49 crc kubenswrapper[4869]: I0218 06:51:49.006147 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-h9skj_93c9d860-2ea7-4a81-b383-aae67501c7f8/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 06:51:49 crc kubenswrapper[4869]: I0218 06:51:49.084101 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-cnsv8_a49b5bd1-87a9-4536-b8fb-5f32f8024b8a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 06:51:49 crc kubenswrapper[4869]: I0218 06:51:49.238574 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-d4jbn_d4b7d5ea-dca6-4f74-8143-17a7573402d3/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 06:51:49 crc kubenswrapper[4869]: I0218 06:51:49.329371 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-5rbl5_76fa3bfe-8da3-4ceb-95c6-de5473957a3e/ssh-known-hosts-edpm-deployment/0.log"
Feb 18 06:51:49 crc kubenswrapper[4869]: I0218 06:51:49.602808 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-846bf8ff8c-7j4wb_23e9d7e3-bcc7-493e-84a6-e646ab36e6f0/proxy-server/0.log"
Feb 18 06:51:49 crc kubenswrapper[4869]: I0218 06:51:49.619604 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-846bf8ff8c-7j4wb_23e9d7e3-bcc7-493e-84a6-e646ab36e6f0/proxy-httpd/0.log"
Feb 18 06:51:49 crc kubenswrapper[4869]: I0218 06:51:49.706261 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-vc4l8_0cb2f895-3d57-468e-8197-636fcc33afe4/swift-ring-rebalance/0.log"
Feb 18 06:51:49 crc kubenswrapper[4869]: I0218 06:51:49.872853 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/account-auditor/0.log"
Feb 18 06:51:49 crc kubenswrapper[4869]: I0218 06:51:49.873220 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/account-reaper/0.log"
Feb 18 06:51:49 crc kubenswrapper[4869]: I0218 06:51:49.967577 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/account-replicator/0.log"
Feb 18 06:51:50 crc kubenswrapper[4869]: I0218 06:51:50.033404 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/account-server/0.log"
Feb 18 06:51:50 crc kubenswrapper[4869]: I0218 06:51:50.113936 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/container-replicator/0.log"
Feb 18 06:51:50 crc kubenswrapper[4869]: I0218 06:51:50.151140 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/container-server/0.log"
Feb 18 06:51:50 crc kubenswrapper[4869]: I0218 06:51:50.177731 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/container-auditor/0.log"
Feb 18 06:51:50 crc kubenswrapper[4869]: I0218 06:51:50.264663 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/container-updater/0.log"
Feb 18 06:51:50 crc kubenswrapper[4869]: I0218 06:51:50.402051 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/object-auditor/0.log"
Feb 18 06:51:50 crc kubenswrapper[4869]: I0218 06:51:50.409189 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/object-expirer/0.log"
Feb 18 06:51:50 crc kubenswrapper[4869]: I0218 06:51:50.462383 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/object-server/0.log"
Feb 18 06:51:50 crc kubenswrapper[4869]: I0218 06:51:50.485895 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/object-replicator/0.log"
Feb 18 06:51:50 crc kubenswrapper[4869]: I0218 06:51:50.580474 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/object-updater/0.log"
Feb 18 06:51:50 crc kubenswrapper[4869]: I0218 06:51:50.656177 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/rsync/0.log"
Feb 18 06:51:50 crc kubenswrapper[4869]: I0218 06:51:50.751813 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_253fcb12-2d2f-4cac-bd2c-aa2dc0fb6624/swift-recon-cron/0.log"
Feb 18 06:51:50 crc kubenswrapper[4869]: I0218 06:51:50.886270 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-dmdd5_bbec6484-4b0d-477a-832a-9fb69ce89f4a/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 06:51:51 crc kubenswrapper[4869]: I0218 06:51:51.123447 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_4450687d-212c-4577-9511-05a7f072b274/tempest-tests-tempest-tests-runner/0.log"
Feb 18 06:51:51 crc kubenswrapper[4869]: I0218 06:51:51.264483 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_498e7aa0-bb28-4bf6-919c-fef01def7f3d/test-operator-logs-container/0.log"
Feb 18 06:51:51 crc kubenswrapper[4869]: I0218 06:51:51.309130 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-52cjm_eeb5893c-dc4f-4cb4-b55b-007c03e03889/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 06:51:59 crc kubenswrapper[4869]: I0218 06:51:59.585223 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_a6c89f7e-0259-4c42-9e24-cd8391cda1a3/memcached/0.log"
Feb 18 06:52:00 crc kubenswrapper[4869]: I0218 06:52:00.469924 4869 scope.go:117] "RemoveContainer" containerID="cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae"
Feb 18 06:52:00 crc kubenswrapper[4869]: E0218 06:52:00.470425 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d"
Feb 18 06:52:15 crc kubenswrapper[4869]: I0218 06:52:15.470682 4869 scope.go:117] "RemoveContainer" containerID="cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae"
Feb 18 06:52:15 crc kubenswrapper[4869]: E0218 06:52:15.471516 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d"
Feb 18 06:52:16 crc kubenswrapper[4869]: I0218 06:52:16.824690 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4_a65bf9b8-ddf3-4a68-8dfd-fa484987b27b/util/0.log"
Feb 18 06:52:17 crc kubenswrapper[4869]: I0218 06:52:17.046385 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4_a65bf9b8-ddf3-4a68-8dfd-fa484987b27b/pull/0.log"
Feb 18 06:52:17 crc kubenswrapper[4869]: I0218 06:52:17.095091 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4_a65bf9b8-ddf3-4a68-8dfd-fa484987b27b/pull/0.log"
Feb 18 06:52:17 crc kubenswrapper[4869]: I0218 06:52:17.138640 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4_a65bf9b8-ddf3-4a68-8dfd-fa484987b27b/util/0.log"
Feb 18 06:52:17 crc kubenswrapper[4869]: I0218 06:52:17.329618 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4_a65bf9b8-ddf3-4a68-8dfd-fa484987b27b/util/0.log"
Feb 18 06:52:17 crc kubenswrapper[4869]: I0218 06:52:17.384793 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4_a65bf9b8-ddf3-4a68-8dfd-fa484987b27b/pull/0.log"
Feb 18 06:52:17 crc kubenswrapper[4869]: I0218 06:52:17.420774 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fx7nk4_a65bf9b8-ddf3-4a68-8dfd-fa484987b27b/extract/0.log"
Feb 18 06:52:17 crc kubenswrapper[4869]: I0218 06:52:17.773352 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-8gxgm_77f20e81-cc4d-44ab-9f77-40080cc392ec/manager/0.log"
Feb 18 06:52:18 crc kubenswrapper[4869]: I0218 06:52:18.318382 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-gfxvp_02aea0c3-b59c-41dd-9c48-514fd4bfa94c/manager/0.log"
Feb 18 06:52:18 crc kubenswrapper[4869]: I0218 06:52:18.499534 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-s9l2b_65820ad0-cf24-499c-b418-8980edb8788a/manager/0.log"
Feb 18 06:52:18 crc kubenswrapper[4869]: I0218 06:52:18.898330 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-n5zdw_4a638516-be5b-4a24-9d1a-cc5dbcaac3ed/manager/0.log"
Feb 18 06:52:19 crc kubenswrapper[4869]: I0218 06:52:19.640900 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-55txz_da28bc19-c4b2-4d9a-8357-6ce9680567ce/manager/0.log"
Feb 18 06:52:19 crc kubenswrapper[4869]: I0218 06:52:19.645886 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-fnrx8_ea6c026b-8825-42ec-8b66-9c2842957c10/manager/0.log"
Feb 18 06:52:19 crc kubenswrapper[4869]: I0218 06:52:19.826666 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-knzcc_e5345d15-54e7-4c42-92d2-e3f4d63e9533/manager/0.log"
Feb 18 06:52:20 crc kubenswrapper[4869]: I0218 06:52:20.042632 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-btm9t_3519f676-e828-4ec9-8995-ecf778e36d4f/manager/0.log"
Feb 18 06:52:20 crc kubenswrapper[4869]: I0218 06:52:20.080589 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-5t2ll_3627c187-4d3b-49cb-9367-5758e676b0af/manager/0.log"
Feb 18 06:52:20 crc kubenswrapper[4869]: I0218 06:52:20.274864 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-sbq48_835a35ac-1347-46f7-ae71-aa38e8aea7cf/manager/0.log"
Feb 18 06:52:20 crc kubenswrapper[4869]: I0218 06:52:20.490614 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-drwmx_cbb0292f-e15f-4a01-bd91-1c155779be07/manager/0.log"
Feb 18 06:52:20 crc kubenswrapper[4869]: I0218 06:52:20.638895 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-vfs4v_269fa527-4152-4014-b070-7e651d5f7b2f/manager/0.log"
Feb 18 06:52:20 crc kubenswrapper[4869]: I0218 06:52:20.977126 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9c4sdfv_8fdd1d16-0b06-4553-b43a-943fb22f8961/manager/0.log"
Feb 18 06:52:21 crc kubenswrapper[4869]: I0218 06:52:21.462214 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-766dc4fc6-rtp2x_a795b61b-e61c-46e5-a72e-e64ca8421756/operator/0.log"
Feb 18 06:52:21 crc kubenswrapper[4869]: I0218 06:52:21.684337 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-crznf_5f3e331a-a2a6-4bd2-adce-f586154b805c/registry-server/0.log"
Feb 18 06:52:21 crc kubenswrapper[4869]: I0218 06:52:21.906803 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-gbx94_e90352c9-520d-40dc-b9f6-3919a8bd67fb/manager/0.log"
Feb 18 06:52:22 crc kubenswrapper[4869]: I0218 06:52:22.130210 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-47sw8_e2de3218-8c57-43ab-b45e-e69a92456549/manager/0.log"
Feb 18 06:52:22 crc kubenswrapper[4869]: I0218 06:52:22.311592 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-wfxk8_9d84cc45-ca42-454f-9323-35d717ea7cd4/operator/0.log"
Feb 18 06:52:22 crc kubenswrapper[4869]: I0218 06:52:22.535193 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-86mkx_f954d2f9-baf9-4d98-bee1-05598035e3a1/manager/0.log"
Feb 18 06:52:22 crc kubenswrapper[4869]: I0218 06:52:22.882183 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-xkrff_63e77cf1-d554-4e43-a6e0-93e671cc90fc/manager/0.log"
Feb 18 06:52:23 crc kubenswrapper[4869]: I0218 06:52:23.032524 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-6nn8b_c7070d2d-1fcc-4ae8-9380-d0f500c95d01/manager/0.log"
Feb 18 06:52:23 crc kubenswrapper[4869]: I0218 06:52:23.112293 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7g26w"]
Feb 18 06:52:23 crc kubenswrapper[4869]: E0218 06:52:23.112703 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d6bcd99-710f-4171-83bc-833811c2b97d" containerName="container-00"
Feb 18 06:52:23 crc kubenswrapper[4869]: I0218 06:52:23.112720 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d6bcd99-710f-4171-83bc-833811c2b97d" containerName="container-00"
Feb 18 06:52:23 crc kubenswrapper[4869]: I0218 06:52:23.113078 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d6bcd99-710f-4171-83bc-833811c2b97d" containerName="container-00"
Feb 18 06:52:23 crc kubenswrapper[4869]: I0218 06:52:23.114315 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7g26w"
Feb 18 06:52:23 crc kubenswrapper[4869]: I0218 06:52:23.118608 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-dccc9b448-nffdh_3990868e-7ca4-439b-a244-6a3336628877/manager/0.log"
Feb 18 06:52:23 crc kubenswrapper[4869]: I0218 06:52:23.123982 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7g26w"]
Feb 18 06:52:23 crc kubenswrapper[4869]: I0218 06:52:23.158287 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/626b1ed5-4611-4f82-9391-45c351c3998e-utilities\") pod \"redhat-marketplace-7g26w\" (UID: \"626b1ed5-4611-4f82-9391-45c351c3998e\") " pod="openshift-marketplace/redhat-marketplace-7g26w"
Feb 18 06:52:23 crc kubenswrapper[4869]: I0218 06:52:23.158337 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p2nj\" (UniqueName: \"kubernetes.io/projected/626b1ed5-4611-4f82-9391-45c351c3998e-kube-api-access-6p2nj\") pod \"redhat-marketplace-7g26w\" (UID: \"626b1ed5-4611-4f82-9391-45c351c3998e\") " pod="openshift-marketplace/redhat-marketplace-7g26w"
Feb 18 06:52:23 crc kubenswrapper[4869]: I0218 06:52:23.158460 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/626b1ed5-4611-4f82-9391-45c351c3998e-catalog-content\") pod \"redhat-marketplace-7g26w\" (UID: \"626b1ed5-4611-4f82-9391-45c351c3998e\") " pod="openshift-marketplace/redhat-marketplace-7g26w"
Feb 18 06:52:23 crc kubenswrapper[4869]: I0218 06:52:23.172355 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-2ctsx_78799685-a70e-4b5d-ae0f-fbd4ac1f48fd/manager/0.log"
Feb 18 06:52:23 crc kubenswrapper[4869]: I0218 06:52:23.241317 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-t6pg6_fe9a7273-de20-4420-8335-dc291458c338/manager/0.log"
Feb 18 06:52:23 crc kubenswrapper[4869]: I0218 06:52:23.259890 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/626b1ed5-4611-4f82-9391-45c351c3998e-catalog-content\") pod \"redhat-marketplace-7g26w\" (UID: \"626b1ed5-4611-4f82-9391-45c351c3998e\") " pod="openshift-marketplace/redhat-marketplace-7g26w"
Feb 18 06:52:23 crc kubenswrapper[4869]: I0218 06:52:23.259988 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/626b1ed5-4611-4f82-9391-45c351c3998e-utilities\") pod \"redhat-marketplace-7g26w\" (UID: \"626b1ed5-4611-4f82-9391-45c351c3998e\") " pod="openshift-marketplace/redhat-marketplace-7g26w"
Feb 18 06:52:23 crc kubenswrapper[4869]: I0218 06:52:23.260024 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p2nj\" (UniqueName: \"kubernetes.io/projected/626b1ed5-4611-4f82-9391-45c351c3998e-kube-api-access-6p2nj\") pod \"redhat-marketplace-7g26w\" (UID: \"626b1ed5-4611-4f82-9391-45c351c3998e\") " pod="openshift-marketplace/redhat-marketplace-7g26w"
Feb 18 06:52:23 crc kubenswrapper[4869]: I0218 06:52:23.260359 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/626b1ed5-4611-4f82-9391-45c351c3998e-catalog-content\") pod \"redhat-marketplace-7g26w\" (UID: \"626b1ed5-4611-4f82-9391-45c351c3998e\") " pod="openshift-marketplace/redhat-marketplace-7g26w"
Feb 18 06:52:23 crc kubenswrapper[4869]: I0218 06:52:23.260402 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/626b1ed5-4611-4f82-9391-45c351c3998e-utilities\") pod \"redhat-marketplace-7g26w\" (UID: \"626b1ed5-4611-4f82-9391-45c351c3998e\") " pod="openshift-marketplace/redhat-marketplace-7g26w"
Feb 18 06:52:23 crc kubenswrapper[4869]: I0218 06:52:23.281844 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p2nj\" (UniqueName: \"kubernetes.io/projected/626b1ed5-4611-4f82-9391-45c351c3998e-kube-api-access-6p2nj\") pod \"redhat-marketplace-7g26w\" (UID: \"626b1ed5-4611-4f82-9391-45c351c3998e\") " pod="openshift-marketplace/redhat-marketplace-7g26w"
Feb 18 06:52:23 crc kubenswrapper[4869]: I0218 06:52:23.449241 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7g26w"
Feb 18 06:52:24 crc kubenswrapper[4869]: I0218 06:52:24.059007 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7g26w"]
Feb 18 06:52:24 crc kubenswrapper[4869]: W0218 06:52:24.063012 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod626b1ed5_4611_4f82_9391_45c351c3998e.slice/crio-f849a81e884d61df5d4e113c79c060cfcbb5a5c0e3458a3e2f7a5de32769dc4b WatchSource:0}: Error finding container f849a81e884d61df5d4e113c79c060cfcbb5a5c0e3458a3e2f7a5de32769dc4b: Status 404 returned error can't find the container with id f849a81e884d61df5d4e113c79c060cfcbb5a5c0e3458a3e2f7a5de32769dc4b
Feb 18 06:52:24 crc kubenswrapper[4869]: I0218 06:52:24.813805 4869 generic.go:334] "Generic (PLEG): container finished" podID="626b1ed5-4611-4f82-9391-45c351c3998e" containerID="e70f77944f2c7e5fb9c70bb4f40fa8156ad76402f93ab7fb0b6553eefe159a48" exitCode=0
Feb 18 06:52:24 crc kubenswrapper[4869]: I0218 06:52:24.814417 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7g26w" event={"ID":"626b1ed5-4611-4f82-9391-45c351c3998e","Type":"ContainerDied","Data":"e70f77944f2c7e5fb9c70bb4f40fa8156ad76402f93ab7fb0b6553eefe159a48"}
Feb 18 06:52:24 crc kubenswrapper[4869]: I0218 06:52:24.814445 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7g26w" event={"ID":"626b1ed5-4611-4f82-9391-45c351c3998e","Type":"ContainerStarted","Data":"f849a81e884d61df5d4e113c79c060cfcbb5a5c0e3458a3e2f7a5de32769dc4b"}
Feb 18 06:52:25 crc kubenswrapper[4869]: I0218 06:52:25.827244 4869 generic.go:334] "Generic (PLEG): container finished" podID="626b1ed5-4611-4f82-9391-45c351c3998e" containerID="b76f0a9ee84491c73f48634f5984ca2f1407521e2e7f08116ba79acae79603b5" exitCode=0
Feb 18 06:52:25 crc kubenswrapper[4869]: I0218 06:52:25.827577 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7g26w" event={"ID":"626b1ed5-4611-4f82-9391-45c351c3998e","Type":"ContainerDied","Data":"b76f0a9ee84491c73f48634f5984ca2f1407521e2e7f08116ba79acae79603b5"}
Feb 18 06:52:26 crc kubenswrapper[4869]: I0218 06:52:26.469932 4869 scope.go:117] "RemoveContainer" containerID="cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae"
Feb 18 06:52:26 crc kubenswrapper[4869]: E0218 06:52:26.470455 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d"
Feb 18 06:52:26 crc kubenswrapper[4869]: I0218 06:52:26.600728 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-q5gnw_c1a14efa-4a9b-49b6-a882-c0d080269850/manager/0.log"
Feb 18 06:52:26 crc kubenswrapper[4869]: I0218 06:52:26.842987 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7g26w" event={"ID":"626b1ed5-4611-4f82-9391-45c351c3998e","Type":"ContainerStarted","Data":"2569d014f19b2c53ccc54ff213bfae94f91bf1b9cf18d8a0e343ba2a7287bd01"}
Feb 18 06:52:26 crc kubenswrapper[4869]: I0218 06:52:26.863817 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7g26w" podStartSLOduration=2.436721049 podStartE2EDuration="3.863795383s" podCreationTimestamp="2026-02-18 06:52:23 +0000 UTC" firstStartedPulling="2026-02-18 06:52:24.817431927 +0000 UTC m=+3841.986520159" lastFinishedPulling="2026-02-18 06:52:26.244506271 +0000 UTC m=+3843.413594493" observedRunningTime="2026-02-18 06:52:26.856373603 +0000 UTC m=+3844.025461835" watchObservedRunningTime="2026-02-18 06:52:26.863795383 +0000 UTC m=+3844.032883615"
Feb 18 06:52:33 crc kubenswrapper[4869]: I0218 06:52:33.451093 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7g26w"
Feb 18 06:52:33 crc kubenswrapper[4869]: I0218 06:52:33.451731 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7g26w"
Feb 18 06:52:33 crc kubenswrapper[4869]: I0218 06:52:33.499870 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7g26w"
Feb 18 06:52:33 crc kubenswrapper[4869]: I0218 06:52:33.943443 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7g26w"
Feb 18 06:52:33 crc kubenswrapper[4869]: I0218 06:52:33.992369 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7g26w"]
Feb 18 06:52:35 crc kubenswrapper[4869]: I0218 06:52:35.915701 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7g26w" podUID="626b1ed5-4611-4f82-9391-45c351c3998e" containerName="registry-server" containerID="cri-o://2569d014f19b2c53ccc54ff213bfae94f91bf1b9cf18d8a0e343ba2a7287bd01" gracePeriod=2
Feb 18 06:52:36 crc kubenswrapper[4869]: I0218 06:52:36.453472 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7g26w"
Feb 18 06:52:36 crc kubenswrapper[4869]: I0218 06:52:36.570084 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/626b1ed5-4611-4f82-9391-45c351c3998e-catalog-content\") pod \"626b1ed5-4611-4f82-9391-45c351c3998e\" (UID: \"626b1ed5-4611-4f82-9391-45c351c3998e\") "
Feb 18 06:52:36 crc kubenswrapper[4869]: I0218 06:52:36.570159 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/626b1ed5-4611-4f82-9391-45c351c3998e-utilities\") pod \"626b1ed5-4611-4f82-9391-45c351c3998e\" (UID: \"626b1ed5-4611-4f82-9391-45c351c3998e\") "
Feb 18 06:52:36 crc kubenswrapper[4869]: I0218 06:52:36.570236 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p2nj\" (UniqueName: \"kubernetes.io/projected/626b1ed5-4611-4f82-9391-45c351c3998e-kube-api-access-6p2nj\") pod \"626b1ed5-4611-4f82-9391-45c351c3998e\" (UID: \"626b1ed5-4611-4f82-9391-45c351c3998e\") "
Feb 18 06:52:36 crc kubenswrapper[4869]: I0218 06:52:36.572243 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/626b1ed5-4611-4f82-9391-45c351c3998e-utilities" (OuterVolumeSpecName: "utilities") pod "626b1ed5-4611-4f82-9391-45c351c3998e" (UID: "626b1ed5-4611-4f82-9391-45c351c3998e"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:52:36 crc kubenswrapper[4869]: I0218 06:52:36.581217 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/626b1ed5-4611-4f82-9391-45c351c3998e-kube-api-access-6p2nj" (OuterVolumeSpecName: "kube-api-access-6p2nj") pod "626b1ed5-4611-4f82-9391-45c351c3998e" (UID: "626b1ed5-4611-4f82-9391-45c351c3998e"). InnerVolumeSpecName "kube-api-access-6p2nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:52:36 crc kubenswrapper[4869]: I0218 06:52:36.603871 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/626b1ed5-4611-4f82-9391-45c351c3998e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "626b1ed5-4611-4f82-9391-45c351c3998e" (UID: "626b1ed5-4611-4f82-9391-45c351c3998e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:52:36 crc kubenswrapper[4869]: I0218 06:52:36.673003 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/626b1ed5-4611-4f82-9391-45c351c3998e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:52:36 crc kubenswrapper[4869]: I0218 06:52:36.673038 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/626b1ed5-4611-4f82-9391-45c351c3998e-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:52:36 crc kubenswrapper[4869]: I0218 06:52:36.673048 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p2nj\" (UniqueName: \"kubernetes.io/projected/626b1ed5-4611-4f82-9391-45c351c3998e-kube-api-access-6p2nj\") on node \"crc\" DevicePath \"\"" Feb 18 06:52:36 crc kubenswrapper[4869]: I0218 06:52:36.928579 4869 generic.go:334] "Generic (PLEG): container finished" podID="626b1ed5-4611-4f82-9391-45c351c3998e" 
containerID="2569d014f19b2c53ccc54ff213bfae94f91bf1b9cf18d8a0e343ba2a7287bd01" exitCode=0 Feb 18 06:52:36 crc kubenswrapper[4869]: I0218 06:52:36.929487 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7g26w" event={"ID":"626b1ed5-4611-4f82-9391-45c351c3998e","Type":"ContainerDied","Data":"2569d014f19b2c53ccc54ff213bfae94f91bf1b9cf18d8a0e343ba2a7287bd01"} Feb 18 06:52:36 crc kubenswrapper[4869]: I0218 06:52:36.929993 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7g26w" event={"ID":"626b1ed5-4611-4f82-9391-45c351c3998e","Type":"ContainerDied","Data":"f849a81e884d61df5d4e113c79c060cfcbb5a5c0e3458a3e2f7a5de32769dc4b"} Feb 18 06:52:36 crc kubenswrapper[4869]: I0218 06:52:36.930028 4869 scope.go:117] "RemoveContainer" containerID="2569d014f19b2c53ccc54ff213bfae94f91bf1b9cf18d8a0e343ba2a7287bd01" Feb 18 06:52:36 crc kubenswrapper[4869]: I0218 06:52:36.929551 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7g26w" Feb 18 06:52:36 crc kubenswrapper[4869]: I0218 06:52:36.950658 4869 scope.go:117] "RemoveContainer" containerID="b76f0a9ee84491c73f48634f5984ca2f1407521e2e7f08116ba79acae79603b5" Feb 18 06:52:36 crc kubenswrapper[4869]: I0218 06:52:36.967908 4869 scope.go:117] "RemoveContainer" containerID="e70f77944f2c7e5fb9c70bb4f40fa8156ad76402f93ab7fb0b6553eefe159a48" Feb 18 06:52:36 crc kubenswrapper[4869]: I0218 06:52:36.990055 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7g26w"] Feb 18 06:52:37 crc kubenswrapper[4869]: I0218 06:52:37.001632 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7g26w"] Feb 18 06:52:37 crc kubenswrapper[4869]: I0218 06:52:37.016522 4869 scope.go:117] "RemoveContainer" containerID="2569d014f19b2c53ccc54ff213bfae94f91bf1b9cf18d8a0e343ba2a7287bd01" Feb 18 06:52:37 crc kubenswrapper[4869]: E0218 06:52:37.017309 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2569d014f19b2c53ccc54ff213bfae94f91bf1b9cf18d8a0e343ba2a7287bd01\": container with ID starting with 2569d014f19b2c53ccc54ff213bfae94f91bf1b9cf18d8a0e343ba2a7287bd01 not found: ID does not exist" containerID="2569d014f19b2c53ccc54ff213bfae94f91bf1b9cf18d8a0e343ba2a7287bd01" Feb 18 06:52:37 crc kubenswrapper[4869]: I0218 06:52:37.017352 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2569d014f19b2c53ccc54ff213bfae94f91bf1b9cf18d8a0e343ba2a7287bd01"} err="failed to get container status \"2569d014f19b2c53ccc54ff213bfae94f91bf1b9cf18d8a0e343ba2a7287bd01\": rpc error: code = NotFound desc = could not find container \"2569d014f19b2c53ccc54ff213bfae94f91bf1b9cf18d8a0e343ba2a7287bd01\": container with ID starting with 2569d014f19b2c53ccc54ff213bfae94f91bf1b9cf18d8a0e343ba2a7287bd01 not found: 
ID does not exist" Feb 18 06:52:37 crc kubenswrapper[4869]: I0218 06:52:37.017374 4869 scope.go:117] "RemoveContainer" containerID="b76f0a9ee84491c73f48634f5984ca2f1407521e2e7f08116ba79acae79603b5" Feb 18 06:52:37 crc kubenswrapper[4869]: E0218 06:52:37.018255 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b76f0a9ee84491c73f48634f5984ca2f1407521e2e7f08116ba79acae79603b5\": container with ID starting with b76f0a9ee84491c73f48634f5984ca2f1407521e2e7f08116ba79acae79603b5 not found: ID does not exist" containerID="b76f0a9ee84491c73f48634f5984ca2f1407521e2e7f08116ba79acae79603b5" Feb 18 06:52:37 crc kubenswrapper[4869]: I0218 06:52:37.018298 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b76f0a9ee84491c73f48634f5984ca2f1407521e2e7f08116ba79acae79603b5"} err="failed to get container status \"b76f0a9ee84491c73f48634f5984ca2f1407521e2e7f08116ba79acae79603b5\": rpc error: code = NotFound desc = could not find container \"b76f0a9ee84491c73f48634f5984ca2f1407521e2e7f08116ba79acae79603b5\": container with ID starting with b76f0a9ee84491c73f48634f5984ca2f1407521e2e7f08116ba79acae79603b5 not found: ID does not exist" Feb 18 06:52:37 crc kubenswrapper[4869]: I0218 06:52:37.018314 4869 scope.go:117] "RemoveContainer" containerID="e70f77944f2c7e5fb9c70bb4f40fa8156ad76402f93ab7fb0b6553eefe159a48" Feb 18 06:52:37 crc kubenswrapper[4869]: E0218 06:52:37.018640 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e70f77944f2c7e5fb9c70bb4f40fa8156ad76402f93ab7fb0b6553eefe159a48\": container with ID starting with e70f77944f2c7e5fb9c70bb4f40fa8156ad76402f93ab7fb0b6553eefe159a48 not found: ID does not exist" containerID="e70f77944f2c7e5fb9c70bb4f40fa8156ad76402f93ab7fb0b6553eefe159a48" Feb 18 06:52:37 crc kubenswrapper[4869]: I0218 06:52:37.018661 4869 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e70f77944f2c7e5fb9c70bb4f40fa8156ad76402f93ab7fb0b6553eefe159a48"} err="failed to get container status \"e70f77944f2c7e5fb9c70bb4f40fa8156ad76402f93ab7fb0b6553eefe159a48\": rpc error: code = NotFound desc = could not find container \"e70f77944f2c7e5fb9c70bb4f40fa8156ad76402f93ab7fb0b6553eefe159a48\": container with ID starting with e70f77944f2c7e5fb9c70bb4f40fa8156ad76402f93ab7fb0b6553eefe159a48 not found: ID does not exist" Feb 18 06:52:37 crc kubenswrapper[4869]: I0218 06:52:37.482184 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="626b1ed5-4611-4f82-9391-45c351c3998e" path="/var/lib/kubelet/pods/626b1ed5-4611-4f82-9391-45c351c3998e/volumes" Feb 18 06:52:40 crc kubenswrapper[4869]: I0218 06:52:40.470308 4869 scope.go:117] "RemoveContainer" containerID="cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae" Feb 18 06:52:40 crc kubenswrapper[4869]: E0218 06:52:40.470951 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:52:44 crc kubenswrapper[4869]: I0218 06:52:44.563679 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-pmqtp_be5b136e-e664-4ed5-9fbd-e2a9bdd06db9/control-plane-machine-set-operator/0.log" Feb 18 06:52:44 crc kubenswrapper[4869]: I0218 06:52:44.697070 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lljlj_7411fb2b-cd62-452d-a5c8-94135752329d/kube-rbac-proxy/0.log" Feb 18 06:52:44 crc kubenswrapper[4869]: I0218 
06:52:44.775582 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lljlj_7411fb2b-cd62-452d-a5c8-94135752329d/machine-api-operator/0.log" Feb 18 06:52:53 crc kubenswrapper[4869]: I0218 06:52:53.476136 4869 scope.go:117] "RemoveContainer" containerID="cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae" Feb 18 06:52:53 crc kubenswrapper[4869]: E0218 06:52:53.477051 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:52:57 crc kubenswrapper[4869]: I0218 06:52:57.188630 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-xc26m_2742a7dc-644d-4ede-be60-c014ffd5ad38/cert-manager-controller/0.log" Feb 18 06:52:57 crc kubenswrapper[4869]: I0218 06:52:57.419476 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-v52pk_6b66e86e-81f9-44f2-b711-ad17fc1504a6/cert-manager-cainjector/0.log" Feb 18 06:52:57 crc kubenswrapper[4869]: I0218 06:52:57.487212 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-zvwk2_6e6c32f7-24fc-4637-9607-bef3c0d85bb7/cert-manager-webhook/0.log" Feb 18 06:53:08 crc kubenswrapper[4869]: I0218 06:53:08.470533 4869 scope.go:117] "RemoveContainer" containerID="cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae" Feb 18 06:53:08 crc kubenswrapper[4869]: E0218 06:53:08.471207 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:53:09 crc kubenswrapper[4869]: I0218 06:53:09.080144 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-x7pfd_babd7df7-75bc-470a-a960-c1d0317f2f8e/nmstate-console-plugin/0.log" Feb 18 06:53:09 crc kubenswrapper[4869]: I0218 06:53:09.281319 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-kljbt_02b9ed08-af4d-434a-8042-9b4acedf423c/nmstate-handler/0.log" Feb 18 06:53:09 crc kubenswrapper[4869]: I0218 06:53:09.312013 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-tkd86_9c59b69a-72a6-4ce0-9b47-c53016b5ac3a/nmstate-metrics/0.log" Feb 18 06:53:09 crc kubenswrapper[4869]: I0218 06:53:09.325763 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-tkd86_9c59b69a-72a6-4ce0-9b47-c53016b5ac3a/kube-rbac-proxy/0.log" Feb 18 06:53:09 crc kubenswrapper[4869]: I0218 06:53:09.488760 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-mfnzm_851e9bf0-c3bc-454d-a3e3-ade0dd734f5a/nmstate-operator/0.log" Feb 18 06:53:09 crc kubenswrapper[4869]: I0218 06:53:09.598116 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-j5g4n_1582a9d4-d3f8-4ca1-8853-6e6e8cc10d92/nmstate-webhook/0.log" Feb 18 06:53:23 crc kubenswrapper[4869]: I0218 06:53:23.476337 4869 scope.go:117] "RemoveContainer" containerID="cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae" Feb 18 06:53:23 crc kubenswrapper[4869]: E0218 06:53:23.477161 4869 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:53:35 crc kubenswrapper[4869]: I0218 06:53:35.128800 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dv525"] Feb 18 06:53:35 crc kubenswrapper[4869]: E0218 06:53:35.129544 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="626b1ed5-4611-4f82-9391-45c351c3998e" containerName="extract-utilities" Feb 18 06:53:35 crc kubenswrapper[4869]: I0218 06:53:35.129555 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="626b1ed5-4611-4f82-9391-45c351c3998e" containerName="extract-utilities" Feb 18 06:53:35 crc kubenswrapper[4869]: E0218 06:53:35.129568 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="626b1ed5-4611-4f82-9391-45c351c3998e" containerName="extract-content" Feb 18 06:53:35 crc kubenswrapper[4869]: I0218 06:53:35.129575 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="626b1ed5-4611-4f82-9391-45c351c3998e" containerName="extract-content" Feb 18 06:53:35 crc kubenswrapper[4869]: E0218 06:53:35.129587 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="626b1ed5-4611-4f82-9391-45c351c3998e" containerName="registry-server" Feb 18 06:53:35 crc kubenswrapper[4869]: I0218 06:53:35.129592 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="626b1ed5-4611-4f82-9391-45c351c3998e" containerName="registry-server" Feb 18 06:53:35 crc kubenswrapper[4869]: I0218 06:53:35.129786 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="626b1ed5-4611-4f82-9391-45c351c3998e" containerName="registry-server" Feb 18 06:53:35 crc 
kubenswrapper[4869]: I0218 06:53:35.130950 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dv525" Feb 18 06:53:35 crc kubenswrapper[4869]: I0218 06:53:35.137880 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dv525"] Feb 18 06:53:35 crc kubenswrapper[4869]: I0218 06:53:35.307936 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53f145d2-6113-4580-8c49-2b73f4f13d68-utilities\") pod \"redhat-operators-dv525\" (UID: \"53f145d2-6113-4580-8c49-2b73f4f13d68\") " pod="openshift-marketplace/redhat-operators-dv525" Feb 18 06:53:35 crc kubenswrapper[4869]: I0218 06:53:35.308231 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53f145d2-6113-4580-8c49-2b73f4f13d68-catalog-content\") pod \"redhat-operators-dv525\" (UID: \"53f145d2-6113-4580-8c49-2b73f4f13d68\") " pod="openshift-marketplace/redhat-operators-dv525" Feb 18 06:53:35 crc kubenswrapper[4869]: I0218 06:53:35.308373 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66ddd\" (UniqueName: \"kubernetes.io/projected/53f145d2-6113-4580-8c49-2b73f4f13d68-kube-api-access-66ddd\") pod \"redhat-operators-dv525\" (UID: \"53f145d2-6113-4580-8c49-2b73f4f13d68\") " pod="openshift-marketplace/redhat-operators-dv525" Feb 18 06:53:35 crc kubenswrapper[4869]: I0218 06:53:35.409642 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53f145d2-6113-4580-8c49-2b73f4f13d68-catalog-content\") pod \"redhat-operators-dv525\" (UID: \"53f145d2-6113-4580-8c49-2b73f4f13d68\") " pod="openshift-marketplace/redhat-operators-dv525" Feb 18 06:53:35 crc 
kubenswrapper[4869]: I0218 06:53:35.409722 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66ddd\" (UniqueName: \"kubernetes.io/projected/53f145d2-6113-4580-8c49-2b73f4f13d68-kube-api-access-66ddd\") pod \"redhat-operators-dv525\" (UID: \"53f145d2-6113-4580-8c49-2b73f4f13d68\") " pod="openshift-marketplace/redhat-operators-dv525" Feb 18 06:53:35 crc kubenswrapper[4869]: I0218 06:53:35.409796 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53f145d2-6113-4580-8c49-2b73f4f13d68-utilities\") pod \"redhat-operators-dv525\" (UID: \"53f145d2-6113-4580-8c49-2b73f4f13d68\") " pod="openshift-marketplace/redhat-operators-dv525" Feb 18 06:53:35 crc kubenswrapper[4869]: I0218 06:53:35.410142 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53f145d2-6113-4580-8c49-2b73f4f13d68-catalog-content\") pod \"redhat-operators-dv525\" (UID: \"53f145d2-6113-4580-8c49-2b73f4f13d68\") " pod="openshift-marketplace/redhat-operators-dv525" Feb 18 06:53:35 crc kubenswrapper[4869]: I0218 06:53:35.410207 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53f145d2-6113-4580-8c49-2b73f4f13d68-utilities\") pod \"redhat-operators-dv525\" (UID: \"53f145d2-6113-4580-8c49-2b73f4f13d68\") " pod="openshift-marketplace/redhat-operators-dv525" Feb 18 06:53:35 crc kubenswrapper[4869]: I0218 06:53:35.445664 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66ddd\" (UniqueName: \"kubernetes.io/projected/53f145d2-6113-4580-8c49-2b73f4f13d68-kube-api-access-66ddd\") pod \"redhat-operators-dv525\" (UID: \"53f145d2-6113-4580-8c49-2b73f4f13d68\") " pod="openshift-marketplace/redhat-operators-dv525" Feb 18 06:53:35 crc kubenswrapper[4869]: I0218 06:53:35.468180 4869 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dv525" Feb 18 06:53:35 crc kubenswrapper[4869]: I0218 06:53:35.631900 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-sh2wz_e23bbfea-0160-46be-ae71-7ff977953af2/controller/0.log" Feb 18 06:53:35 crc kubenswrapper[4869]: I0218 06:53:35.711478 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-sh2wz_e23bbfea-0160-46be-ae71-7ff977953af2/kube-rbac-proxy/0.log" Feb 18 06:53:35 crc kubenswrapper[4869]: I0218 06:53:35.858133 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/cp-frr-files/0.log" Feb 18 06:53:36 crc kubenswrapper[4869]: I0218 06:53:36.012820 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dv525"] Feb 18 06:53:36 crc kubenswrapper[4869]: I0218 06:53:36.094426 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/cp-reloader/0.log" Feb 18 06:53:36 crc kubenswrapper[4869]: I0218 06:53:36.131387 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/cp-frr-files/0.log" Feb 18 06:53:36 crc kubenswrapper[4869]: I0218 06:53:36.171336 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/cp-reloader/0.log" Feb 18 06:53:36 crc kubenswrapper[4869]: I0218 06:53:36.193815 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/cp-metrics/0.log" Feb 18 06:53:36 crc kubenswrapper[4869]: I0218 06:53:36.401968 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/cp-metrics/0.log" Feb 18 06:53:36 crc kubenswrapper[4869]: I0218 06:53:36.409899 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/cp-reloader/0.log" Feb 18 06:53:36 crc kubenswrapper[4869]: I0218 06:53:36.410283 4869 generic.go:334] "Generic (PLEG): container finished" podID="53f145d2-6113-4580-8c49-2b73f4f13d68" containerID="e226e3aa93ae00bdec8b769134653f4cbcee3341816a09fb592ee89be06f2f71" exitCode=0 Feb 18 06:53:36 crc kubenswrapper[4869]: I0218 06:53:36.410422 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dv525" event={"ID":"53f145d2-6113-4580-8c49-2b73f4f13d68","Type":"ContainerDied","Data":"e226e3aa93ae00bdec8b769134653f4cbcee3341816a09fb592ee89be06f2f71"} Feb 18 06:53:36 crc kubenswrapper[4869]: I0218 06:53:36.410544 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dv525" event={"ID":"53f145d2-6113-4580-8c49-2b73f4f13d68","Type":"ContainerStarted","Data":"9480655eeda442d85f9f1100b43bf378de430ac243dbdfe731bf13275e0da451"} Feb 18 06:53:36 crc kubenswrapper[4869]: I0218 06:53:36.447455 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/cp-frr-files/0.log" Feb 18 06:53:36 crc kubenswrapper[4869]: I0218 06:53:36.447477 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/cp-metrics/0.log" Feb 18 06:53:36 crc kubenswrapper[4869]: I0218 06:53:36.652204 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/cp-frr-files/0.log" Feb 18 06:53:36 crc kubenswrapper[4869]: I0218 06:53:36.716832 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/controller/0.log" Feb 18 06:53:36 crc kubenswrapper[4869]: I0218 06:53:36.731846 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/cp-metrics/0.log" Feb 18 06:53:36 crc kubenswrapper[4869]: I0218 06:53:36.846166 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/cp-reloader/0.log" Feb 18 06:53:36 crc kubenswrapper[4869]: I0218 06:53:36.968715 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/frr-metrics/0.log" Feb 18 06:53:37 crc kubenswrapper[4869]: I0218 06:53:37.007111 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/kube-rbac-proxy/0.log" Feb 18 06:53:37 crc kubenswrapper[4869]: I0218 06:53:37.056237 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/kube-rbac-proxy-frr/0.log" Feb 18 06:53:37 crc kubenswrapper[4869]: I0218 06:53:37.241376 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/reloader/0.log" Feb 18 06:53:37 crc kubenswrapper[4869]: I0218 06:53:37.288316 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-r6mvd_c67c84c4-b4c3-4336-a6b9-3543258cea17/frr-k8s-webhook-server/0.log" Feb 18 06:53:37 crc kubenswrapper[4869]: I0218 06:53:37.471815 4869 scope.go:117] "RemoveContainer" containerID="cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae" Feb 18 06:53:37 crc kubenswrapper[4869]: E0218 06:53:37.472322 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lv6qh_openshift-machine-config-operator(781aec66-5fc7-4161-a704-cc78830d525d)\"" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" Feb 18 06:53:37 crc kubenswrapper[4869]: I0218 06:53:37.480408 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dv525" event={"ID":"53f145d2-6113-4580-8c49-2b73f4f13d68","Type":"ContainerStarted","Data":"d97e01e40f18fce9d94bc4b69f5d76d96fb99946c5fab91c7a6b680365a64914"} Feb 18 06:53:37 crc kubenswrapper[4869]: I0218 06:53:37.714278 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-55df77c686-fqtt5_74eef01f-c0d7-449c-bca9-7eb78f808110/manager/0.log" Feb 18 06:53:37 crc kubenswrapper[4869]: I0218 06:53:37.816483 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-69666c74dd-pv6sz_d597c072-fd48-4245-8a9a-5a80aaa78993/webhook-server/0.log" Feb 18 06:53:37 crc kubenswrapper[4869]: I0218 06:53:37.936954 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-v8dwx_f63c8d8a-ba54-4ddf-9105-5f886b7984d9/kube-rbac-proxy/0.log" Feb 18 06:53:38 crc kubenswrapper[4869]: I0218 06:53:38.331899 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h4mbr_24b94d41-cd3f-4c37-86a1-0ed957404bab/frr/0.log" Feb 18 06:53:38 crc kubenswrapper[4869]: I0218 06:53:38.442500 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-v8dwx_f63c8d8a-ba54-4ddf-9105-5f886b7984d9/speaker/0.log" Feb 18 06:53:40 crc kubenswrapper[4869]: E0218 06:53:40.774036 4869 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53f145d2_6113_4580_8c49_2b73f4f13d68.slice/crio-conmon-d97e01e40f18fce9d94bc4b69f5d76d96fb99946c5fab91c7a6b680365a64914.scope\": RecentStats: unable to find data in memory cache]" Feb 18 06:53:41 crc kubenswrapper[4869]: I0218 06:53:41.502848 4869 generic.go:334] "Generic (PLEG): container finished" podID="53f145d2-6113-4580-8c49-2b73f4f13d68" containerID="d97e01e40f18fce9d94bc4b69f5d76d96fb99946c5fab91c7a6b680365a64914" exitCode=0 Feb 18 06:53:41 crc kubenswrapper[4869]: I0218 06:53:41.502915 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dv525" event={"ID":"53f145d2-6113-4580-8c49-2b73f4f13d68","Type":"ContainerDied","Data":"d97e01e40f18fce9d94bc4b69f5d76d96fb99946c5fab91c7a6b680365a64914"} Feb 18 06:53:42 crc kubenswrapper[4869]: I0218 06:53:42.513351 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dv525" event={"ID":"53f145d2-6113-4580-8c49-2b73f4f13d68","Type":"ContainerStarted","Data":"aa47cdf38aced90378a97342b7497e2b0b4805a44f3f444634a41d3ef4e21c1a"} Feb 18 06:53:42 crc kubenswrapper[4869]: I0218 06:53:42.538700 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dv525" podStartSLOduration=1.995289542 podStartE2EDuration="7.538684985s" podCreationTimestamp="2026-02-18 06:53:35 +0000 UTC" firstStartedPulling="2026-02-18 06:53:36.411840096 +0000 UTC m=+3913.580928318" lastFinishedPulling="2026-02-18 06:53:41.955235519 +0000 UTC m=+3919.124323761" observedRunningTime="2026-02-18 06:53:42.533266294 +0000 UTC m=+3919.702354526" watchObservedRunningTime="2026-02-18 06:53:42.538684985 +0000 UTC m=+3919.707773217" Feb 18 06:53:45 crc kubenswrapper[4869]: I0218 06:53:45.468342 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dv525" Feb 18 06:53:45 crc 
kubenswrapper[4869]: I0218 06:53:45.468850 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dv525" Feb 18 06:53:46 crc kubenswrapper[4869]: I0218 06:53:46.520324 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dv525" podUID="53f145d2-6113-4580-8c49-2b73f4f13d68" containerName="registry-server" probeResult="failure" output=< Feb 18 06:53:46 crc kubenswrapper[4869]: timeout: failed to connect service ":50051" within 1s Feb 18 06:53:46 crc kubenswrapper[4869]: > Feb 18 06:53:48 crc kubenswrapper[4869]: I0218 06:53:48.470519 4869 scope.go:117] "RemoveContainer" containerID="cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae" Feb 18 06:53:49 crc kubenswrapper[4869]: I0218 06:53:49.574699 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerStarted","Data":"f75f21fd08de68660042ebd0d3b6e4f6d83dc2fcd399d5f6e97a589d459d3ff1"} Feb 18 06:53:50 crc kubenswrapper[4869]: I0218 06:53:50.833519 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d_1730c713-0f10-4294-a20d-c7d7a2d4403f/util/0.log" Feb 18 06:53:50 crc kubenswrapper[4869]: I0218 06:53:50.998920 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d_1730c713-0f10-4294-a20d-c7d7a2d4403f/util/0.log" Feb 18 06:53:51 crc kubenswrapper[4869]: I0218 06:53:51.063540 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d_1730c713-0f10-4294-a20d-c7d7a2d4403f/pull/0.log" Feb 18 06:53:51 crc kubenswrapper[4869]: I0218 06:53:51.108366 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d_1730c713-0f10-4294-a20d-c7d7a2d4403f/pull/0.log" Feb 18 06:53:51 crc kubenswrapper[4869]: I0218 06:53:51.379210 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d_1730c713-0f10-4294-a20d-c7d7a2d4403f/util/0.log" Feb 18 06:53:51 crc kubenswrapper[4869]: I0218 06:53:51.389199 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d_1730c713-0f10-4294-a20d-c7d7a2d4403f/extract/0.log" Feb 18 06:53:51 crc kubenswrapper[4869]: I0218 06:53:51.419891 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213st75d_1730c713-0f10-4294-a20d-c7d7a2d4403f/pull/0.log" Feb 18 06:53:51 crc kubenswrapper[4869]: I0218 06:53:51.549453 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-d6h99_7d008101-ce8c-46ba-986d-a6c77a268c0b/extract-utilities/0.log" Feb 18 06:53:51 crc kubenswrapper[4869]: I0218 06:53:51.763235 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-d6h99_7d008101-ce8c-46ba-986d-a6c77a268c0b/extract-utilities/0.log" Feb 18 06:53:51 crc kubenswrapper[4869]: I0218 06:53:51.778652 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-d6h99_7d008101-ce8c-46ba-986d-a6c77a268c0b/extract-content/0.log" Feb 18 06:53:51 crc kubenswrapper[4869]: I0218 06:53:51.817199 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-d6h99_7d008101-ce8c-46ba-986d-a6c77a268c0b/extract-content/0.log" Feb 18 06:53:52 crc kubenswrapper[4869]: I0218 06:53:52.069435 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-d6h99_7d008101-ce8c-46ba-986d-a6c77a268c0b/extract-utilities/0.log" Feb 18 06:53:52 crc kubenswrapper[4869]: I0218 06:53:52.077811 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-d6h99_7d008101-ce8c-46ba-986d-a6c77a268c0b/extract-content/0.log" Feb 18 06:53:52 crc kubenswrapper[4869]: I0218 06:53:52.456645 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dfdnd_dd620fad-d652-4a98-95d7-8470686b4219/extract-utilities/0.log" Feb 18 06:53:52 crc kubenswrapper[4869]: I0218 06:53:52.474779 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-d6h99_7d008101-ce8c-46ba-986d-a6c77a268c0b/registry-server/0.log" Feb 18 06:53:52 crc kubenswrapper[4869]: I0218 06:53:52.657552 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dfdnd_dd620fad-d652-4a98-95d7-8470686b4219/extract-utilities/0.log" Feb 18 06:53:52 crc kubenswrapper[4869]: I0218 06:53:52.694329 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dfdnd_dd620fad-d652-4a98-95d7-8470686b4219/extract-content/0.log" Feb 18 06:53:52 crc kubenswrapper[4869]: I0218 06:53:52.728522 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dfdnd_dd620fad-d652-4a98-95d7-8470686b4219/extract-content/0.log" Feb 18 06:53:52 crc kubenswrapper[4869]: I0218 06:53:52.958184 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dfdnd_dd620fad-d652-4a98-95d7-8470686b4219/extract-utilities/0.log" Feb 18 06:53:52 crc kubenswrapper[4869]: I0218 06:53:52.964666 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-dfdnd_dd620fad-d652-4a98-95d7-8470686b4219/extract-content/0.log" Feb 18 06:53:53 crc kubenswrapper[4869]: I0218 06:53:53.184709 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7_dd932d93-4a7d-4779-8833-23887167b576/util/0.log" Feb 18 06:53:53 crc kubenswrapper[4869]: I0218 06:53:53.201779 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dfdnd_dd620fad-d652-4a98-95d7-8470686b4219/registry-server/0.log" Feb 18 06:53:53 crc kubenswrapper[4869]: I0218 06:53:53.397402 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7_dd932d93-4a7d-4779-8833-23887167b576/util/0.log" Feb 18 06:53:53 crc kubenswrapper[4869]: I0218 06:53:53.410684 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7_dd932d93-4a7d-4779-8833-23887167b576/pull/0.log" Feb 18 06:53:53 crc kubenswrapper[4869]: I0218 06:53:53.431753 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7_dd932d93-4a7d-4779-8833-23887167b576/pull/0.log" Feb 18 06:53:53 crc kubenswrapper[4869]: I0218 06:53:53.629213 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7_dd932d93-4a7d-4779-8833-23887167b576/util/0.log" Feb 18 06:53:53 crc kubenswrapper[4869]: I0218 06:53:53.649722 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7_dd932d93-4a7d-4779-8833-23887167b576/pull/0.log" Feb 18 06:53:53 crc kubenswrapper[4869]: I0218 06:53:53.671471 
4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatv4w7_dd932d93-4a7d-4779-8833-23887167b576/extract/0.log" Feb 18 06:53:53 crc kubenswrapper[4869]: I0218 06:53:53.823994 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-997j2_89d643e7-cad7-4856-9d82-c0370e1f20e5/marketplace-operator/0.log" Feb 18 06:53:53 crc kubenswrapper[4869]: I0218 06:53:53.902310 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rnzzv_d26ef514-863b-48dd-8576-d68036b43bf6/extract-utilities/0.log" Feb 18 06:53:54 crc kubenswrapper[4869]: I0218 06:53:54.096648 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rnzzv_d26ef514-863b-48dd-8576-d68036b43bf6/extract-utilities/0.log" Feb 18 06:53:54 crc kubenswrapper[4869]: I0218 06:53:54.099366 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rnzzv_d26ef514-863b-48dd-8576-d68036b43bf6/extract-content/0.log" Feb 18 06:53:54 crc kubenswrapper[4869]: I0218 06:53:54.114324 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rnzzv_d26ef514-863b-48dd-8576-d68036b43bf6/extract-content/0.log" Feb 18 06:53:54 crc kubenswrapper[4869]: I0218 06:53:54.308405 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rnzzv_d26ef514-863b-48dd-8576-d68036b43bf6/extract-utilities/0.log" Feb 18 06:53:54 crc kubenswrapper[4869]: I0218 06:53:54.434326 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rnzzv_d26ef514-863b-48dd-8576-d68036b43bf6/extract-content/0.log" Feb 18 06:53:54 crc kubenswrapper[4869]: I0218 06:53:54.504485 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-dv525_53f145d2-6113-4580-8c49-2b73f4f13d68/extract-utilities/0.log" Feb 18 06:53:54 crc kubenswrapper[4869]: I0218 06:53:54.551643 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rnzzv_d26ef514-863b-48dd-8576-d68036b43bf6/registry-server/0.log" Feb 18 06:53:54 crc kubenswrapper[4869]: I0218 06:53:54.818830 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dv525_53f145d2-6113-4580-8c49-2b73f4f13d68/extract-content/0.log" Feb 18 06:53:54 crc kubenswrapper[4869]: I0218 06:53:54.847066 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dv525_53f145d2-6113-4580-8c49-2b73f4f13d68/extract-utilities/0.log" Feb 18 06:53:54 crc kubenswrapper[4869]: I0218 06:53:54.886521 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dv525_53f145d2-6113-4580-8c49-2b73f4f13d68/extract-content/0.log" Feb 18 06:53:55 crc kubenswrapper[4869]: I0218 06:53:55.076458 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dv525_53f145d2-6113-4580-8c49-2b73f4f13d68/extract-utilities/0.log" Feb 18 06:53:55 crc kubenswrapper[4869]: I0218 06:53:55.078614 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dv525_53f145d2-6113-4580-8c49-2b73f4f13d68/extract-content/0.log" Feb 18 06:53:55 crc kubenswrapper[4869]: I0218 06:53:55.132583 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dv525_53f145d2-6113-4580-8c49-2b73f4f13d68/registry-server/0.log" Feb 18 06:53:55 crc kubenswrapper[4869]: I0218 06:53:55.253139 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j6klp_83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22/extract-utilities/0.log" Feb 18 
06:53:55 crc kubenswrapper[4869]: I0218 06:53:55.414246 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j6klp_83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22/extract-utilities/0.log" Feb 18 06:53:55 crc kubenswrapper[4869]: I0218 06:53:55.457551 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j6klp_83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22/extract-content/0.log" Feb 18 06:53:55 crc kubenswrapper[4869]: I0218 06:53:55.481179 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j6klp_83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22/extract-content/0.log" Feb 18 06:53:55 crc kubenswrapper[4869]: I0218 06:53:55.531587 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dv525" Feb 18 06:53:55 crc kubenswrapper[4869]: I0218 06:53:55.609051 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dv525" Feb 18 06:53:55 crc kubenswrapper[4869]: I0218 06:53:55.667678 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j6klp_83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22/extract-content/0.log" Feb 18 06:53:55 crc kubenswrapper[4869]: I0218 06:53:55.672968 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j6klp_83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22/extract-utilities/0.log" Feb 18 06:53:55 crc kubenswrapper[4869]: I0218 06:53:55.778476 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dv525"] Feb 18 06:53:56 crc kubenswrapper[4869]: I0218 06:53:56.265253 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j6klp_83fb8d7a-66e5-44f7-8b25-9b88e3b2ff22/registry-server/0.log" Feb 18 06:53:56 crc kubenswrapper[4869]: I0218 06:53:56.628728 
4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dv525" podUID="53f145d2-6113-4580-8c49-2b73f4f13d68" containerName="registry-server" containerID="cri-o://aa47cdf38aced90378a97342b7497e2b0b4805a44f3f444634a41d3ef4e21c1a" gracePeriod=2 Feb 18 06:53:57 crc kubenswrapper[4869]: I0218 06:53:57.122152 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dv525" Feb 18 06:53:57 crc kubenswrapper[4869]: I0218 06:53:57.206195 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66ddd\" (UniqueName: \"kubernetes.io/projected/53f145d2-6113-4580-8c49-2b73f4f13d68-kube-api-access-66ddd\") pod \"53f145d2-6113-4580-8c49-2b73f4f13d68\" (UID: \"53f145d2-6113-4580-8c49-2b73f4f13d68\") " Feb 18 06:53:57 crc kubenswrapper[4869]: I0218 06:53:57.206322 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53f145d2-6113-4580-8c49-2b73f4f13d68-catalog-content\") pod \"53f145d2-6113-4580-8c49-2b73f4f13d68\" (UID: \"53f145d2-6113-4580-8c49-2b73f4f13d68\") " Feb 18 06:53:57 crc kubenswrapper[4869]: I0218 06:53:57.206387 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53f145d2-6113-4580-8c49-2b73f4f13d68-utilities\") pod \"53f145d2-6113-4580-8c49-2b73f4f13d68\" (UID: \"53f145d2-6113-4580-8c49-2b73f4f13d68\") " Feb 18 06:53:57 crc kubenswrapper[4869]: I0218 06:53:57.207364 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53f145d2-6113-4580-8c49-2b73f4f13d68-utilities" (OuterVolumeSpecName: "utilities") pod "53f145d2-6113-4580-8c49-2b73f4f13d68" (UID: "53f145d2-6113-4580-8c49-2b73f4f13d68"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:53:57 crc kubenswrapper[4869]: I0218 06:53:57.211820 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53f145d2-6113-4580-8c49-2b73f4f13d68-kube-api-access-66ddd" (OuterVolumeSpecName: "kube-api-access-66ddd") pod "53f145d2-6113-4580-8c49-2b73f4f13d68" (UID: "53f145d2-6113-4580-8c49-2b73f4f13d68"). InnerVolumeSpecName "kube-api-access-66ddd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:53:57 crc kubenswrapper[4869]: I0218 06:53:57.308182 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53f145d2-6113-4580-8c49-2b73f4f13d68-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:53:57 crc kubenswrapper[4869]: I0218 06:53:57.308219 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66ddd\" (UniqueName: \"kubernetes.io/projected/53f145d2-6113-4580-8c49-2b73f4f13d68-kube-api-access-66ddd\") on node \"crc\" DevicePath \"\"" Feb 18 06:53:57 crc kubenswrapper[4869]: I0218 06:53:57.328989 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53f145d2-6113-4580-8c49-2b73f4f13d68-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "53f145d2-6113-4580-8c49-2b73f4f13d68" (UID: "53f145d2-6113-4580-8c49-2b73f4f13d68"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:53:57 crc kubenswrapper[4869]: I0218 06:53:57.409598 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53f145d2-6113-4580-8c49-2b73f4f13d68-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:53:57 crc kubenswrapper[4869]: I0218 06:53:57.638964 4869 generic.go:334] "Generic (PLEG): container finished" podID="53f145d2-6113-4580-8c49-2b73f4f13d68" containerID="aa47cdf38aced90378a97342b7497e2b0b4805a44f3f444634a41d3ef4e21c1a" exitCode=0 Feb 18 06:53:57 crc kubenswrapper[4869]: I0218 06:53:57.639073 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dv525" Feb 18 06:53:57 crc kubenswrapper[4869]: I0218 06:53:57.640554 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dv525" event={"ID":"53f145d2-6113-4580-8c49-2b73f4f13d68","Type":"ContainerDied","Data":"aa47cdf38aced90378a97342b7497e2b0b4805a44f3f444634a41d3ef4e21c1a"} Feb 18 06:53:57 crc kubenswrapper[4869]: I0218 06:53:57.640702 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dv525" event={"ID":"53f145d2-6113-4580-8c49-2b73f4f13d68","Type":"ContainerDied","Data":"9480655eeda442d85f9f1100b43bf378de430ac243dbdfe731bf13275e0da451"} Feb 18 06:53:57 crc kubenswrapper[4869]: I0218 06:53:57.640766 4869 scope.go:117] "RemoveContainer" containerID="aa47cdf38aced90378a97342b7497e2b0b4805a44f3f444634a41d3ef4e21c1a" Feb 18 06:53:57 crc kubenswrapper[4869]: I0218 06:53:57.662967 4869 scope.go:117] "RemoveContainer" containerID="d97e01e40f18fce9d94bc4b69f5d76d96fb99946c5fab91c7a6b680365a64914" Feb 18 06:53:57 crc kubenswrapper[4869]: I0218 06:53:57.670538 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dv525"] Feb 18 06:53:57 crc kubenswrapper[4869]: I0218 
06:53:57.684179 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dv525"] Feb 18 06:53:57 crc kubenswrapper[4869]: I0218 06:53:57.697047 4869 scope.go:117] "RemoveContainer" containerID="e226e3aa93ae00bdec8b769134653f4cbcee3341816a09fb592ee89be06f2f71" Feb 18 06:53:57 crc kubenswrapper[4869]: I0218 06:53:57.721427 4869 scope.go:117] "RemoveContainer" containerID="aa47cdf38aced90378a97342b7497e2b0b4805a44f3f444634a41d3ef4e21c1a" Feb 18 06:53:57 crc kubenswrapper[4869]: E0218 06:53:57.721971 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa47cdf38aced90378a97342b7497e2b0b4805a44f3f444634a41d3ef4e21c1a\": container with ID starting with aa47cdf38aced90378a97342b7497e2b0b4805a44f3f444634a41d3ef4e21c1a not found: ID does not exist" containerID="aa47cdf38aced90378a97342b7497e2b0b4805a44f3f444634a41d3ef4e21c1a" Feb 18 06:53:57 crc kubenswrapper[4869]: I0218 06:53:57.722003 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa47cdf38aced90378a97342b7497e2b0b4805a44f3f444634a41d3ef4e21c1a"} err="failed to get container status \"aa47cdf38aced90378a97342b7497e2b0b4805a44f3f444634a41d3ef4e21c1a\": rpc error: code = NotFound desc = could not find container \"aa47cdf38aced90378a97342b7497e2b0b4805a44f3f444634a41d3ef4e21c1a\": container with ID starting with aa47cdf38aced90378a97342b7497e2b0b4805a44f3f444634a41d3ef4e21c1a not found: ID does not exist" Feb 18 06:53:57 crc kubenswrapper[4869]: I0218 06:53:57.722022 4869 scope.go:117] "RemoveContainer" containerID="d97e01e40f18fce9d94bc4b69f5d76d96fb99946c5fab91c7a6b680365a64914" Feb 18 06:53:57 crc kubenswrapper[4869]: E0218 06:53:57.722430 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d97e01e40f18fce9d94bc4b69f5d76d96fb99946c5fab91c7a6b680365a64914\": container with ID 
starting with d97e01e40f18fce9d94bc4b69f5d76d96fb99946c5fab91c7a6b680365a64914 not found: ID does not exist" containerID="d97e01e40f18fce9d94bc4b69f5d76d96fb99946c5fab91c7a6b680365a64914" Feb 18 06:53:57 crc kubenswrapper[4869]: I0218 06:53:57.722454 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d97e01e40f18fce9d94bc4b69f5d76d96fb99946c5fab91c7a6b680365a64914"} err="failed to get container status \"d97e01e40f18fce9d94bc4b69f5d76d96fb99946c5fab91c7a6b680365a64914\": rpc error: code = NotFound desc = could not find container \"d97e01e40f18fce9d94bc4b69f5d76d96fb99946c5fab91c7a6b680365a64914\": container with ID starting with d97e01e40f18fce9d94bc4b69f5d76d96fb99946c5fab91c7a6b680365a64914 not found: ID does not exist" Feb 18 06:53:57 crc kubenswrapper[4869]: I0218 06:53:57.722494 4869 scope.go:117] "RemoveContainer" containerID="e226e3aa93ae00bdec8b769134653f4cbcee3341816a09fb592ee89be06f2f71" Feb 18 06:53:57 crc kubenswrapper[4869]: E0218 06:53:57.723106 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e226e3aa93ae00bdec8b769134653f4cbcee3341816a09fb592ee89be06f2f71\": container with ID starting with e226e3aa93ae00bdec8b769134653f4cbcee3341816a09fb592ee89be06f2f71 not found: ID does not exist" containerID="e226e3aa93ae00bdec8b769134653f4cbcee3341816a09fb592ee89be06f2f71" Feb 18 06:53:57 crc kubenswrapper[4869]: I0218 06:53:57.723165 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e226e3aa93ae00bdec8b769134653f4cbcee3341816a09fb592ee89be06f2f71"} err="failed to get container status \"e226e3aa93ae00bdec8b769134653f4cbcee3341816a09fb592ee89be06f2f71\": rpc error: code = NotFound desc = could not find container \"e226e3aa93ae00bdec8b769134653f4cbcee3341816a09fb592ee89be06f2f71\": container with ID starting with e226e3aa93ae00bdec8b769134653f4cbcee3341816a09fb592ee89be06f2f71 not found: 
ID does not exist" Feb 18 06:53:59 crc kubenswrapper[4869]: I0218 06:53:59.484547 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53f145d2-6113-4580-8c49-2b73f4f13d68" path="/var/lib/kubelet/pods/53f145d2-6113-4580-8c49-2b73f4f13d68/volumes" Feb 18 06:54:23 crc kubenswrapper[4869]: E0218 06:54:23.502026 4869 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.50:41864->38.102.83.50:44207: write tcp 38.102.83.50:41864->38.102.83.50:44207: write: broken pipe Feb 18 06:55:46 crc kubenswrapper[4869]: I0218 06:55:46.684026 4869 generic.go:334] "Generic (PLEG): container finished" podID="295226a5-0fdf-44b2-aed1-22bef38de348" containerID="d007828f6f0162f1e15091dda765fc1f941c48f2cad4ea823c138de403dff0f8" exitCode=0 Feb 18 06:55:46 crc kubenswrapper[4869]: I0218 06:55:46.684134 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8bgt/must-gather-6wpzm" event={"ID":"295226a5-0fdf-44b2-aed1-22bef38de348","Type":"ContainerDied","Data":"d007828f6f0162f1e15091dda765fc1f941c48f2cad4ea823c138de403dff0f8"} Feb 18 06:55:46 crc kubenswrapper[4869]: I0218 06:55:46.685121 4869 scope.go:117] "RemoveContainer" containerID="d007828f6f0162f1e15091dda765fc1f941c48f2cad4ea823c138de403dff0f8" Feb 18 06:55:47 crc kubenswrapper[4869]: I0218 06:55:47.066137 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b8bgt_must-gather-6wpzm_295226a5-0fdf-44b2-aed1-22bef38de348/gather/0.log" Feb 18 06:55:57 crc kubenswrapper[4869]: I0218 06:55:57.740170 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b8bgt/must-gather-6wpzm"] Feb 18 06:55:57 crc kubenswrapper[4869]: I0218 06:55:57.741067 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-b8bgt/must-gather-6wpzm" podUID="295226a5-0fdf-44b2-aed1-22bef38de348" containerName="copy" 
containerID="cri-o://4555bcbb65e3e36ef73a490d07e6d5f26b1c0e9e501164cf7932fc2bf03e7a1b" gracePeriod=2 Feb 18 06:55:57 crc kubenswrapper[4869]: I0218 06:55:57.750469 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b8bgt/must-gather-6wpzm"] Feb 18 06:55:58 crc kubenswrapper[4869]: I0218 06:55:58.143790 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b8bgt_must-gather-6wpzm_295226a5-0fdf-44b2-aed1-22bef38de348/copy/0.log" Feb 18 06:55:58 crc kubenswrapper[4869]: I0218 06:55:58.144513 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b8bgt/must-gather-6wpzm" Feb 18 06:55:58 crc kubenswrapper[4869]: I0218 06:55:58.295862 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vqs4\" (UniqueName: \"kubernetes.io/projected/295226a5-0fdf-44b2-aed1-22bef38de348-kube-api-access-6vqs4\") pod \"295226a5-0fdf-44b2-aed1-22bef38de348\" (UID: \"295226a5-0fdf-44b2-aed1-22bef38de348\") " Feb 18 06:55:58 crc kubenswrapper[4869]: I0218 06:55:58.295980 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/295226a5-0fdf-44b2-aed1-22bef38de348-must-gather-output\") pod \"295226a5-0fdf-44b2-aed1-22bef38de348\" (UID: \"295226a5-0fdf-44b2-aed1-22bef38de348\") " Feb 18 06:55:58 crc kubenswrapper[4869]: I0218 06:55:58.301839 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/295226a5-0fdf-44b2-aed1-22bef38de348-kube-api-access-6vqs4" (OuterVolumeSpecName: "kube-api-access-6vqs4") pod "295226a5-0fdf-44b2-aed1-22bef38de348" (UID: "295226a5-0fdf-44b2-aed1-22bef38de348"). InnerVolumeSpecName "kube-api-access-6vqs4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:55:58 crc kubenswrapper[4869]: I0218 06:55:58.398380 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vqs4\" (UniqueName: \"kubernetes.io/projected/295226a5-0fdf-44b2-aed1-22bef38de348-kube-api-access-6vqs4\") on node \"crc\" DevicePath \"\"" Feb 18 06:55:58 crc kubenswrapper[4869]: I0218 06:55:58.439473 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/295226a5-0fdf-44b2-aed1-22bef38de348-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "295226a5-0fdf-44b2-aed1-22bef38de348" (UID: "295226a5-0fdf-44b2-aed1-22bef38de348"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:55:58 crc kubenswrapper[4869]: I0218 06:55:58.500340 4869 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/295226a5-0fdf-44b2-aed1-22bef38de348-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 18 06:55:58 crc kubenswrapper[4869]: I0218 06:55:58.813240 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b8bgt_must-gather-6wpzm_295226a5-0fdf-44b2-aed1-22bef38de348/copy/0.log" Feb 18 06:55:58 crc kubenswrapper[4869]: I0218 06:55:58.814038 4869 generic.go:334] "Generic (PLEG): container finished" podID="295226a5-0fdf-44b2-aed1-22bef38de348" containerID="4555bcbb65e3e36ef73a490d07e6d5f26b1c0e9e501164cf7932fc2bf03e7a1b" exitCode=143 Feb 18 06:55:58 crc kubenswrapper[4869]: I0218 06:55:58.814082 4869 scope.go:117] "RemoveContainer" containerID="4555bcbb65e3e36ef73a490d07e6d5f26b1c0e9e501164cf7932fc2bf03e7a1b" Feb 18 06:55:58 crc kubenswrapper[4869]: I0218 06:55:58.814098 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b8bgt/must-gather-6wpzm"
Feb 18 06:55:58 crc kubenswrapper[4869]: I0218 06:55:58.833016 4869 scope.go:117] "RemoveContainer" containerID="d007828f6f0162f1e15091dda765fc1f941c48f2cad4ea823c138de403dff0f8"
Feb 18 06:55:58 crc kubenswrapper[4869]: I0218 06:55:58.877728 4869 scope.go:117] "RemoveContainer" containerID="4555bcbb65e3e36ef73a490d07e6d5f26b1c0e9e501164cf7932fc2bf03e7a1b"
Feb 18 06:55:58 crc kubenswrapper[4869]: E0218 06:55:58.878273 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4555bcbb65e3e36ef73a490d07e6d5f26b1c0e9e501164cf7932fc2bf03e7a1b\": container with ID starting with 4555bcbb65e3e36ef73a490d07e6d5f26b1c0e9e501164cf7932fc2bf03e7a1b not found: ID does not exist" containerID="4555bcbb65e3e36ef73a490d07e6d5f26b1c0e9e501164cf7932fc2bf03e7a1b"
Feb 18 06:55:58 crc kubenswrapper[4869]: I0218 06:55:58.878346 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4555bcbb65e3e36ef73a490d07e6d5f26b1c0e9e501164cf7932fc2bf03e7a1b"} err="failed to get container status \"4555bcbb65e3e36ef73a490d07e6d5f26b1c0e9e501164cf7932fc2bf03e7a1b\": rpc error: code = NotFound desc = could not find container \"4555bcbb65e3e36ef73a490d07e6d5f26b1c0e9e501164cf7932fc2bf03e7a1b\": container with ID starting with 4555bcbb65e3e36ef73a490d07e6d5f26b1c0e9e501164cf7932fc2bf03e7a1b not found: ID does not exist"
Feb 18 06:55:58 crc kubenswrapper[4869]: I0218 06:55:58.878400 4869 scope.go:117] "RemoveContainer" containerID="d007828f6f0162f1e15091dda765fc1f941c48f2cad4ea823c138de403dff0f8"
Feb 18 06:55:58 crc kubenswrapper[4869]: E0218 06:55:58.878904 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d007828f6f0162f1e15091dda765fc1f941c48f2cad4ea823c138de403dff0f8\": container with ID starting with d007828f6f0162f1e15091dda765fc1f941c48f2cad4ea823c138de403dff0f8 not found: ID does not exist" containerID="d007828f6f0162f1e15091dda765fc1f941c48f2cad4ea823c138de403dff0f8"
Feb 18 06:55:58 crc kubenswrapper[4869]: I0218 06:55:58.878981 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d007828f6f0162f1e15091dda765fc1f941c48f2cad4ea823c138de403dff0f8"} err="failed to get container status \"d007828f6f0162f1e15091dda765fc1f941c48f2cad4ea823c138de403dff0f8\": rpc error: code = NotFound desc = could not find container \"d007828f6f0162f1e15091dda765fc1f941c48f2cad4ea823c138de403dff0f8\": container with ID starting with d007828f6f0162f1e15091dda765fc1f941c48f2cad4ea823c138de403dff0f8 not found: ID does not exist"
Feb 18 06:55:59 crc kubenswrapper[4869]: I0218 06:55:59.484502 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="295226a5-0fdf-44b2-aed1-22bef38de348" path="/var/lib/kubelet/pods/295226a5-0fdf-44b2-aed1-22bef38de348/volumes"
Feb 18 06:56:10 crc kubenswrapper[4869]: I0218 06:56:10.132992 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 06:56:10 crc kubenswrapper[4869]: I0218 06:56:10.133710 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 06:56:32 crc kubenswrapper[4869]: I0218 06:56:32.955201 4869 scope.go:117] "RemoveContainer" containerID="e0555d1f98cacd6af6a1ac2cc5e1b09722570eb441c4e788d17076f78c70ee3e"
Feb 18 06:56:40 crc kubenswrapper[4869]: I0218 06:56:40.132884 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 06:56:40 crc kubenswrapper[4869]: I0218 06:56:40.134898 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 06:57:10 crc kubenswrapper[4869]: I0218 06:57:10.132593 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 06:57:10 crc kubenswrapper[4869]: I0218 06:57:10.133512 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 06:57:10 crc kubenswrapper[4869]: I0218 06:57:10.133608 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh"
Feb 18 06:57:10 crc kubenswrapper[4869]: I0218 06:57:10.135063 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f75f21fd08de68660042ebd0d3b6e4f6d83dc2fcd399d5f6e97a589d459d3ff1"} pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 18 06:57:10 crc kubenswrapper[4869]: I0218 06:57:10.135197 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" containerID="cri-o://f75f21fd08de68660042ebd0d3b6e4f6d83dc2fcd399d5f6e97a589d459d3ff1" gracePeriod=600
Feb 18 06:57:10 crc kubenswrapper[4869]: I0218 06:57:10.419087 4869 generic.go:334] "Generic (PLEG): container finished" podID="781aec66-5fc7-4161-a704-cc78830d525d" containerID="f75f21fd08de68660042ebd0d3b6e4f6d83dc2fcd399d5f6e97a589d459d3ff1" exitCode=0
Feb 18 06:57:10 crc kubenswrapper[4869]: I0218 06:57:10.419164 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerDied","Data":"f75f21fd08de68660042ebd0d3b6e4f6d83dc2fcd399d5f6e97a589d459d3ff1"}
Feb 18 06:57:10 crc kubenswrapper[4869]: I0218 06:57:10.419337 4869 scope.go:117] "RemoveContainer" containerID="cc963804b09d1fff5ed53b9b44117e39ca677136cc9057d1ec692c139961aaae"
Feb 18 06:57:11 crc kubenswrapper[4869]: I0218 06:57:11.429168 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" event={"ID":"781aec66-5fc7-4161-a704-cc78830d525d","Type":"ContainerStarted","Data":"f519c948be665c57e456cf16828771a7f420e6562aee8a697fe4e5234b885e04"}
Feb 18 06:59:10 crc kubenswrapper[4869]: I0218 06:59:10.133307 4869 patch_prober.go:28] interesting pod/machine-config-daemon-lv6qh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 06:59:10 crc kubenswrapper[4869]: I0218 06:59:10.134924 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lv6qh" podUID="781aec66-5fc7-4161-a704-cc78830d525d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"